Saturday, May 15, 2010

From semantic Web to being taken over by technology

So I was having a nice geeky conversation with a friend of mine about a series of books where humans can extend their ability to take in input through machines, and also dump their personality into a computer and transpose it somewhere else (like a bird). Humans become immortal and can do things we can only imagine. We talked about enhancing what you see with information added by the machine, like you can already do with an iPhone; augmented reality, I believe it is called. Take a video of your location and something will tell you this is a restaurant, here is the menu; this is a famous historical building, buy the ticket now; and so on.

We went on about lots of crazy stuff and it was a really stimulating conversation. I could not really explain why the vision made me feel a little uncomfortable. After all, a God-like life with infinite possibilities is quite interesting. Something very Taoist about it, except you become the Tao. All an illusion of course: if I can be the Tao, I can be pointed to, and so I cannot be the Tao. Augmented reality has something I find scary, which is that someone else tells you what it is you are looking at. And from now on, everyone looking at the same thing will have the same narrow view of what they are looking at.

Then I was watching this video about the semantic web; it is all related, bear with me. It is all about how you cannot really impose one single taxonomy on the world; there is no such thing as one way to look at something. When machines are involved, I cannot help thinking they have to be programmed in some way and thus already carry an interpretation of what they are looking at. If we use them predominantly to be in the world, we risk viewing everything the same way.

Another weird thought occurred to me: people are now constantly posting information about themselves (Twitter and all) and we are becoming instruments ourselves. I see more and more people so obsessed with sharing more and more information while events are happening that they forget to process the feelings and the implications of the events themselves. We are not really very good at reporting an event and understanding its ramifications at the same time.

I suppose it is a long way of saying that the more I think about it, the more I think we also need to unplug and just be. This will lead to better, fuller life experiences and better creativity. Or maybe I am just starting to feel my age :).

Ah Microsoft!

So I am trying to update my contact application. I have really been maintaining the XML directly, which is a little sad. Since I am about to try the new Visual Studio (2010), I am thinking what better way to get acquainted than to update the little application and finally put enough in there that I won’t be the only one using it.

Time to discover, as usual, that Microsoft has changed everything since the last time I cared. I do miss Java sometimes.

For starters, there is still no very good way for me to store a simple database in an XML format. Entity Framework, LINQ to XML, ADO.NET (those are the ones I know of) are of no help. Object serialization seems to be the only way to get it done. After all this time, I can’t believe there is still nothing there to encourage the use of a simple file format as the database. You can have SQL Server and you are OK, but who wants that for a simple list of contacts?
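
For reference, here is roughly what that serialization route looks like. This is just a minimal sketch assuming a made-up Contact class and file path, not my actual application code:

    using System.Collections.Generic;
    using System.IO;
    using System.Xml.Serialization;

    // Hypothetical contact type; the real model would have more fields.
    public class Contact
    {
        public string Name { get; set; }
        public string Email { get; set; }
        public string Phone { get; set; }
    }

    public static class ContactStore
    {
        private static readonly XmlSerializer serializer =
            new XmlSerializer(typeof(List<Contact>));

        // The whole "database" is one XML file holding the full list.
        public static void Save(string path, List<Contact> contacts)
        {
            using (var stream = File.Create(path))
            {
                serializer.Serialize(stream, contacts);
            }
        }

        // Load it back, or start with an empty list the first time.
        public static List<Contact> Load(string path)
        {
            if (!File.Exists(path))
                return new List<Contact>();

            using (var stream = File.OpenRead(path))
            {
                return (List<Contact>)serializer.Deserialize(stream);
            }
        }
    }

It works, but it is hand-rolled plumbing; nothing in the framework pushes you toward a simple file as the database.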

Then I am looking at the application frameworks. I played with Smart Client before; it was OK, just a little too complicated for a simple application. So now I am having another look and, ouch, we have PRISM (the new version of Smart Client as far as I can tell), MEF, MAF, Unity. What is a man to do? I also now discover that we have a new pattern, MVVM, which seems to have been created specifically for WPF (and Silverlight). Now as much as I like separation of concerns when building an application (who has ever heard of a developer with a good eye for graphics?), I cannot help wondering about a new technology which requires its own new pattern (and the other ones were pretty generic to start with). The caveat of course is that if you are doing ASP.NET, then you can stick with the MVC framework they have created for it.
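
To make the MVVM point concrete, here is a minimal sketch of what the pattern boils down to for a little application like mine. The ContactViewModel name and its property are made up for illustration; they do not come from any of those frameworks:

    using System.ComponentModel;

    // Hypothetical view model: the XAML view binds to Name, and the
    // PropertyChanged event keeps the UI in sync when the data changes.
    public class ContactViewModel : INotifyPropertyChanged
    {
        private string name;

        public string Name
        {
            get { return name; }
            set
            {
                if (name == value) return;
                name = value;
                OnPropertyChanged("Name");
            }
        }

        public event PropertyChangedEventHandler PropertyChanged;

        private void OnPropertyChanged(string propertyName)
        {
            var handler = PropertyChanged;
            if (handler != null)
                handler(this, new PropertyChangedEventArgs(propertyName));
        }
    }

The view never touches the data directly and the view model never touches the controls, which is nice in principle, but it is a fair amount of ceremony for a simple contact list.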

Now one of the benefits of going with Microsoft technologies is a unified environment (as opposed to Java, where everyone wants to create their own framework). If you try to apply good practices, you first need to research all the stuff Microsoft has in progress and then check the 3rd-party vendors’ offerings. That is already a six-month project. You need a team just to check where things are at, what is not quite finished and where you think it is all going. You will have an aversion to using 3rd-party products because Microsoft will come up with something which kills the competition, and they will not exist next year. Even Java seems to have standardized on the Spring Framework (for the most part only, of course).
It should be easy to get started, but anything you create with the tools out of the box seems to be rooted in the 1980s. You just cannot write anything that looks good out of the box. Again, you need to look into 3rd-party controls because you don’t even get a decent grid (sorting, grouping with icons) with the regular tools. I think Microsoft is slowly eroding their best competitive advantage: a set of unified tools for you to program the Microsoft world. Is Office 2010 using .NET as the macro language yet, or is it still stuck with VBA?

Sunday, May 9, 2010

Too much information bad for Democracy

Sorry, I have not really had time but I saw this one and could not resist passing it on: Obama: iPod, iPad don't empower.

Not sure I agree with everything there, but it is still interesting. I suppose busy people don’t have as much time as we do and cannot process all the information that is available. I still think there is too much; we need to be able to identify what matters and focus on certain areas.

That being said, tools (like the iPhone and iPad) do not make more information available, although they do make it accessible in more places. We just need to make sure we have tools on those devices which allow us to filter that information.

Although I think the point we should make is that just because information is available does not mean we have to look at it at any time of the day. I have heard of people not being able to sleep anymore because they were reading the news in bed on the iPhone. Of course, who has not heard of a family dinner where no one is talking because everyone is texting or checking the internet? I think the toys are great, but we need to learn how to switch them off.

Sunday, February 14, 2010

When your information does not belong to you anymore


Salesforce is now using the information in the cloud and allowing companies to set up searches to monitor their contact graph and Twitter.

It seems everyone is trying to get on the cloud bandwagon. Google is now searching Twitter and Facebook in as close to real time as they can (so is Bing). Facebook is trying to expose as much data as you will let them by making public the default. I suppose in the latter case they are preparing themselves to make money.

It makes complete sense to try to harness so much information into something which will give a competitive advantage. I cannot help but think about some of the colossal implications.

  • If you had to be careful about what you said about your company before, it gets worse once they can run targeted searches narrowed down to only your employees. It seems you’d better stay away from any comments, positive or negative. Even if your messages are positive, if they are not in line with your company’s marketing message, that is still not very good.
  • Once companies understand the source of the information, it will not take much for them to manipulate it, and then the information will not be very reliable anymore. It will be a bit like Wikipedia: there is mostly good information, but you cannot really take it at face value.
  • There is a huge opportunity here for some very interesting multi-dimensional visualization with evolution over time. That would be such an interesting project to work on.
  • I suppose it also becomes a tool for you to become an ‘expert’ in your field. No need to host conferences anymore; just have lots of opinions and you will come up a lot in searches in your chosen field. Mind you, it is more likely you will be drowned out by the volume of the crowd.
  • I suppose this brings the next thought, which is: ‘is there any need for experts?’ When you have access to all the information of your industry at your fingertips, do you need an expert anymore? Maybe the only experts we will ever need will be data-mining experts and data-visualization experts.
  • I suppose there is no need here to go on discussing the validity of the wisdom of crowds vs. expert opinion and quality vs. popularity.

Sunday, January 10, 2010

The Best and the Worst Tech of the Decade

 

It was the best of decades, it was the worst of decades...

by James Turner

I was looking at this article, which I enjoyed, and could not help having some thoughts :).

The Best

AJAX - It's hard to remember what life was like before Asynchronous JavaScript and XML came along, so I'll prod your memory. It was boring. Web 1.0 consisted of a lot of static web pages, where every mouse click was a round trip to the web server. If you wanted rich content, you had to embed a Java applet in the page, and pray that the client browser supported it.

Without the advent of AJAX, we wouldn't have Web 2.0, GMail, or most of the other cloud-based web applications. Flash is still popular, but especially with HTML 5 on the way, even functionality that formerly required a RIA like Flash or Silverlight can now be accomplished with AJAX.

I love AJAX, but to say that there would have been no Web 2.0, I think that is a bit much. Web 2.0, as I understand it, is a culture of participation and collaboration. Yes, the web applications are friendlier and more usable, but I believe the collaboration was inevitable. Web services and XML (with or without JSON) were more important, I think.

Twitter - When they first started, blogs were just what they said, web logs. In other words, a journal of interesting websites that the author had encountered. These days, blogs are more like platforms for rants, opinions, essays, and anything else on the writer's mind. Then along came Twitter. Sure, people like to find out what J-Lo had for dinner, but the real power of the 140 character dynamo is that it has brought about a resurgence of real web logging. The most useful tweets consist of a Tiny URL and a little bit of context. Combine that with the use of Twitter to send out real time notices about everything from breaking news to the current specials at the corner restaurant, and it's easy to see why Twitter has become a dominant player.

I have said enough about Twitter; staying or not, it has brought a new dimension to our on-line presence and to the possibility of harnessing the opinions of the crowds. Some of it can be called wisdom, some no one cares about…

Ubiquitous WiFi: I want you to imagine you're on the road in the mid-90s. You get to your hotel room, and plop your laptop on the table. Then you get out your handy RJ-11 cord, and check to see if the hotel phone has a data jack (most didn't), or if you'll have to unplug the phone entirely. Then you'd look up the local number for your ISP, and have your laptop dial it, so you could suck down your e-mail at an anemic 56K.

Now, of course, WiFi is everywhere. You may end up having to pay for it, but fast Internet connectivity is available everywhere from your local McDonalds to your hotel room to an airport terminal. Of course, this is not without its downsides, since unsecured WiFi access points have led to all sorts of security headaches, and using an open access point is a risky proposition unless your antivirus software is up to date, but on the whole, ubiquitous WiFi has made the world a much more connected place.

I am not sure we are anywhere close to universal internet access in the USA, and we are definitely not there with WiFi. I only have an iPod Touch, and the lack of connectivity is plainly felt on a daily basis. If we did have it, I am not sure Google would be so involved in trying to open up access to the white spaces now that we have only digital TV. We are nowhere near the ubiquitous access which has been enjoyed in other countries, but I believe we are doing better.

Friday, January 1, 2010

More on Twitter

I have been blogging about Twitter for a while now (mainly Twitter vs. Facebook) and I have to say I am still baffled by how popular it is.

When it is not helping deal with emergencies, where I think the only advantage Twitter has is how easy it is to post and search information, it seems to be a referral tool: a way for a user to advertise content posted at another address (which allows a little bit more information to be posted).

Then I came across this article from someone who actually raves about Twitter and lets you know why. There is a slight inconsistency there, though: the author does admit to using either medium to post information. I especially like the way he describes posting questions and asking his followers for help.

Of course you would have to have followers, and I am not sure I would qualify there. The only followers I get are the ones who seem to be disappearing from Twitter for disqualifying behavior.
He also admits to not following many people; too much information is the same as no information. It seems a bit of a one-sided relationship, but I have to say I understand the approach.

I still think the best part of Twitter is the use of tags and its searchability. There might be a use for the individual, but where it seems to shine is in using the crowd’s information. The danger of course is that when it is not a fad anymore (and I hear some famous people are now disconnecting), if we do not find a way to make it more attractive for the individual, Twitter might very quickly lose its appeal.

I just found another article claiming that Twitter will be there forever. Generally speaking it is hard to agree with that: if Twitter does not evolve, it will surely disappear in favor of something else. But he makes some valid points as to why Twitter is useful now. It is worth the read.

Social media data analysis

As we put more and more data on-line, we will see different ways to analyze the data floating around. What it says about us, I am not so sure. I previously blogged about a tool to show trends in Twitter.

Here is something else that was blogged about by O’Reilly (I do love those guys). It shows a different visualization of some of the information flowing through Twitter. This one also shows trends, but I have to say the visualization seems more effective at showing what is being talked about at one point in time than the actual trends.

In order to see the trend, you need to read the text under each picture. I would be curious to see how they define boring and smart, though. That might be more of a judgment call than a real data categorization. Actually, I just read the article again and it is defined by the author.

I wonder if there is a way to actually define boring vs. smart. People who talk about certain topics would not really call those things boring (or would they?). I suppose the x-axis might be used to represent the trends (getting hotter or cooler).

Sunday, November 1, 2009

The death of languages

I was reading this BBC article about the Death of Language. The heading reads: “An estimated 7,000 languages are being spoken around the world. But that number is expected to shrink rapidly in the coming decades. What is lost when a language dies?”

The most interesting part for me was:
"What we lose is essentially an enormous cultural heritage, the way of expressing the relationship with nature, with the world, between themselves in the framework of their families, their kin people," says Mr Hagege.
"It's also the way they express their humour, their love, their life. It is a testimony of human communities which is extremely precious, because it expresses what other communities than ours in the modern industrialized world are able to express."

I could not help wondering if it is the same thing with programming languages. They are definitely a product of their time and of a specific culture/idea for which they were developed. They have been evolving over time like any other language. I wonder what we would lose if, for instance, COBOL were to disappear, or maybe something like Prolog.

To some extent, a programming language is there to help solve a specific problem. If there is another programming language which helps you do the same thing better and faster, surely you do not lose much. In a sense, a human language is not that different; it tries to address the communication problem.

Whichever way I turn the ideas around, I think it is both a shame and no big deal if we lose such a thing. It is something more tangible than nostalgia for a historical passage of our world. Languages try to address the way we see the world at a point in time. When we lose a language, we maybe lose a dimension of the world we look at. Every language is a compromise to express something; it reflects the choices we make and our priorities. When we have a new language, we may or may not carry this dimension with us, and maybe we are now looking at the world only in a certain way, completely missing this other dimension which could have made a big difference in our perception of it.

Of course it is highly impractical to keep all of them, and to some extent, if everyone spoke the same language, it would be a lot easier. Being a foreigner and all, I cannot help thinking: wouldn’t it be a lot easier if everyone spoke English? Technically, I don’t really care which one it is. But then I am fearful of a world limited to only English words. Maybe there is something equivalent to poetry for programming languages; that is an interesting idea. I wonder what that would look like. We know programming can produce visual and audio art. I wonder what its poetry could look like. Maybe the requirements for any application should also include an artistic element.

Yahoo not in search anymore?

I was reading this article.

The most striking comment was from a Yahoo search expert who came from Google:

“Maarek came to Yahoo from Google, where she was instrumental in the development of front-end presentation enhancements such as Google Suggest. In her view, the last decade brought a revolution in the way Web pages are crawled, indexed, and presented to the user: a revolution that saw Google come out a clear winner.”

If Bing is now replacing the Yahoo search engine, Microsoft now owns the data displayed by Yahoo. O’Reilly has been saying for a while, I believe, that it is all about the data. It does not really matter how much Yahoo claims it is about presentation; I can’t believe that can be correct. Presentation is just so easily copied that it can’t possibly be the only edge.

With my limited wisdom, I believe that what they are really after is another kind of data: user-habit data. If they can collect and harness that data, they can provide the kind of targeted results I have been blogging about. If they can achieve that, I suppose it might give them the edge.

I can’t help being suspicious: if they do not own the data, they become dependent on it. Microsoft is not known for its competitor-friendly tactics. They might require Yahoo to give them a way to gather the same type of data. Or, knowing that Bing powers the search, why would any new user go through Yahoo instead of Bing directly? That can only mean one thing: a slowly decreasing number of users. They are going to have to find the kind of applications that will make users want to stay, some kind of integrated portal to access most of the services people would want to use on-line.

If that is the case, Yahoo is becoming a provider of Software as a Service. There is fierce competition out there, with so many areas to be addressed. Who would be able to know which ones to start with and which ones to focus on first? If I am right, which would be a small miracle, we should see more applications coming out of Yahoo, starting with better email/IM (oh wait, I think they have already done that).

Sunday, October 18, 2009

The positive geek

Not everything in the techie world is doom and gloom. Of course there is lots of it:

  • With information being so available, attention is fragmented and people are getting less productive.
  • There is no privacy left and companies will be able to buy all the information you have been accumulating on-line for the last 5 years
  • Google is becoming the new Microsoft, enough said.
  • The internet is not really helping education, and students are copying their work from on-line sources. Are we actually becoming less educated?
  • There is so much information out there that you just cannot find anything
  • Newspapers are dying, and we will forever talk about the irrelevant latest big mistake of the most famous, because that is all we gossip about on-line and all the information we have time/money for anyway
  • The power of social networks may actually bring about the tyranny of crowds to the entire world.

But there are still a few of us who think there is a lot of good being achieved using technology and the best might still be coming. After all, it could be argued the Internet is still in its infancy.

  • There is now greater transparency of information. Even though sometimes it might look bad, it is a good thing. Democracy can only move forward with open access to all the information about how the system is run.
  • We have creative ways of using the information through mash-ups: crime maps, Google Earth/Sky/Stars.
  • There is still so much that can be done, and yes, there have been really good uses of Twitter in natural disasters; sometimes it was the only mode of communication. The same goes for Facebook during the Iranian elections.
  • Information is accessible from so many different places and smartphones are supposed to be replacing PCs sometime in the near future.
  • In poor countries, people use cell phones to transfer money and to buy and sell products.
  • I am not the only one saying it. This might be the most convincing argument of course.
  • My biggest hope is that there is still so much we have not seen and that is yet to come.