Microsoft have clearly unleashed a 1st year GCSE Graphic Design student on their Windows logo and she has crapped out this supposedly minimalist (read: lazy) monstrosity:
Various sites (CNet, TechRepublic) are saying that the Facebook IPO isn’t for mere mortals like you and me, and that we shouldn’t even consider buying stock. By the time we finally get our chance, the venture capitalists, early investors and company employees will already have had their wallets vacuumed clean and the price will be horribly inflated. But what if Facebook was about to change up a gear and bring its armada of engineers to face the company that has already fired the first shot across their bows?
In the S-1 filing, the document required for private companies in the US to get the ball rolling on going “public”, Facebook stated in no uncertain terms that they were at risk from competitive products. With MySpace no longer a threat, Bebo long dead and Friendster a distant memory (who?), their competition is obviously Google+. Google are the ones in possession of the smoking cannon with their Facebook-a-like social network, and they have recently started playing dirty: signing up to ANY of Google’s services (Calendar, Docs, Gmail, etc.) will AUTOMATICALLY sign you up for a Google+ account. If you’re over 18, that is.
Given the search and advertising monopoly of Google, this hardly seems fair.
A-ha. Welcome to the party. What kept you?
What would really spoil Google’s day? What would make their Google+ potshot look like a potato gun going off? And most importantly, what would give new investors something to really smile about after they put all their cash into Mark Zuckerberg’s university project?
Every website you go to, you see Facebook’s “Like” button. Every website you go to, you see Facebook’s Comments widget (just scroll down a bit to see an example…). Website owners trust Facebook enough to control how word of their sites spread. They trust them to be in control of the primary way of engaging with their readers, visitors and (this is the important one) customers.
You can see where I’m going with this.
Why wouldn’t webmasters also trust Facebook to supply the adverts on their site? Remember, the adverts that Facebook provides are very highly targeted. Just look to the right-hand side of your News Feed next time you’re on FB. Scary, huh? So webmasters are going to enjoy adverts that are far more likely to earn them money. Imagine seeing an advert for a new computer game: Facebook can attach a list of your friends who like that product too (using Sponsored Stories, which is already available to companies willing to pay for it!). That’ll increase the pull of the adverts. It’s all great news for webmasters. And very, very bad news for Google’s AdWords product.
EXACTLY! Google are using Google+ to get inside your mind, your social circles, your likes and dislikes, your answers to open-ended questions such as “What’s on your mind?”. All of this allows them to provide you with more relevant search results and, the important bit again, more relevant adverts that are more likely to be clicked on.
Facebook is in a much better position than Google, however. Google are in “learning” mode. You’re slowly transferring all your likes and tastes across to Google+ from Facebook, but Facebook already has it all. It has the historical records too, which show how your likes and tastes have evolved over time. Google+ can never give that to Google’s advertising engine.
Facebook could launch their version of AdWords tomorrow, and advertisers would drop Google like a weight lifter with sweaty hands. The benefits of advertising through Facebook will be immense.
Don’t forget, Facebook are actually pretty good at offering analytics. For anybody who has ever written an App on the Facebook platform, you’ll know about Insights – a growing collection of stats and utilities you can use to track how your application is doing.
It’s not a quantum leap from monitoring Facebook Platform Apps to monitoring users roaming around websites.
And then Google becomes nothing but what it was in 1998: A Search Engine with a cashflow problem.
I like Facebook. It has values that I identify with. I used to like Google, but then it all went very creepy, stopped being innovative and just started being reactive. Not good for a technology company!
I think Facebook would be crazy not to do this. It has amazing commercial potential while also ingratiating website owners with Mark Zuckerberg’s company.
And lastly, it would kill Google, which would solve that whole “competition” thing, that Facebook said was a real risk.
Due to my job as a software developer, I spend much of my day dealing with cutting-edge technologies and watching the progress of the IT bandwagon as it barrels uncontrollably through every corner of our lives. From listening to music on your iDevice to Internet-connected TVs and fridges to Twitter on your car’s dashboard, it’s difficult to see what’s just about to happen, let alone what will happen a year or more from now. However, if you look at some of the emerging tech while watching what consumers are interested in, one technology seems to have the potential to be the next secret sauce of the Internet…
On modern computers we are all used to playing 3D games and seeing 3-dimensional animations. This is usually done by one of two technologies: DirectX (on Windows) or OpenGL (on everything, including Windows). On the Web, 3D usually comes down to either Silverlight 3D, Flash (typically using Swift 3D – which interestingly also exports to Silverlight!) or some esoteric plug-in that you need to download and install before you can meet your friends in some virtual coffee shop in cyberspace. It’s all a bit late-’90s really.
But now there’s an answer, and it comes as part of the package of technologies we’re calling “HTML5”. One of those features, “WebGL”, was born at Mozilla (home to my personal favourite browser, Firefox). As you can probably guess from the name, it is designed to bring 3D graphics to the browser completely natively (i.e. without any need for plug-ins).
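For the curious, here’s a rough, hand-waving sketch of what plug-in-free 3D looks like from a developer’s point of view. The helper names (compile, link, uploadVertices) are my own shorthand for several lower-level WebGL calls, not real API functions – the real thing is wordier, but the shape is the same:

```
// Pseudocode: the shape of a minimal WebGL render, no plug-ins required
canvas = the <canvas> element already sitting in the page's HTML
gl     = canvas.getContext("webgl")        // the browser's built-in 3D context

vertexShader   = compile(gl, vertexShaderSource)     // runs per-vertex on the GPU
fragmentShader = compile(gl, fragmentShaderSource)   // runs per-pixel on the GPU
program        = link(gl, vertexShader, fragmentShader)

buffer = uploadVertices(gl, [x1,y1,z1, x2,y2,z2, ...]) // geometry goes straight to the GPU

every frame:
    clear the colour and depth buffers
    gl.drawArrays(TRIANGLES, 0, vertexCount)   // the GPU does the hard work
```

No downloads, no plug-in prompts: if the browser ships with WebGL, that JavaScript just runs.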
Why You Need to Be Excited
Smart TVs are becoming more popular and, unlike 3D TVs (at least while you still need the stupid glasses), they make a lot of sense and will continue to grow. Web-enabling your television is long overdue, especially with the glut of online streamable media from Blinkbox, Netflix and even YouTube.
Now listen hard to what Apple are doing: iCloud, iTV, iTunes.
And then think about the rumours of the iConsole.
Imagine getting home, turning on your Smart TV and seeing that the next Call of Duty game has been released. You hit “Buy” and (because of broadband and sensible splitting up of the game) you’re playing it against all your friends almost immediately. In the browser on your television! Updates? Well, they happen automatically, just like when Facebook unveils a new feature and suddenly it’s already there waiting for you next time you log in.
And because it’s “The Web” it’s connected to everything. You can choose to post your latest achievements on your choice of social networks, put your face on your character by choosing your best pic from Flickr and you can take it with you on your WebGL enabled mobile device for playing on-the-move.
This is 18-24 months away from a technical perspective. If you look at the demos below, WebGL is probably at roughly the standard PC games were at 10 years ago. However, PC gaming took so long to get where it is now because the hardware was so (relatively) pants. Now that the hardware is already available, WebGL will come of age significantly faster than DirectX and OpenGL did. In fact it is being driven (and hard!) by the very companies that have a vested interest in making it work (Google, Mozilla, Apple) and that also happen to make the browsers (Chrome, Firefox, Safari) it runs in. It’s a perfect union of the best people in the best places to make WebGL take off.
Of course, there are commercial benefits to the browser vendors doing all of this work. Once Google knows what kind of games you buy, it can advertise those types of games to you in the future on searches and on websites with AdSense.
Once Apple see you playing games in Safari, they can sell you iCloud space for your game saves and profiles.
Mozilla, whose business model I’m not terribly au fait with, umm… well, they’ll end up with a better browser than Internet Explorer!
You can see examples of what is possible with WebGL at various websites.
It’s madness. It’s probably not even gaining traction!
Wrong! After the late Steve Jobs slammed Adobe Flash (ironically, for exactly the same reasons why I will never pay a penny for an Apple product), Adobe admitted defeat and started to wind down Flash in favour of HTML5 on mobile devices, and desktop Flash can’t be much further behind.
Adobe have even launched a new product called Edge, which enables “devigners” (developer/designers) to create Flash-like animations using HTML5, jQuery and CSS3. It’s still gestating in Alpha but is free to download for anyone with a free Adobe account. As we’ve learned again and again (from PCs, gaming, Flash, Silverlight and movies), we’re never happy staying in the 2D realm, and it can’t be long before Adobe introduces 3D functionality into Edge.
Let’s say you’re wrong. What else might happen?
I’m happy to be proven wrong. Life is all about learning. And in a chaotic world like “tech”, being wrong about stuff happens so often it’s hardly anything to worry about. So hedging my bets, what else could happen?
Microsoft’s cross-platform(ish) Silverlight already has great developer tools (Visual Studio!) and support. A shift in their licensing model could mean we see televisions appearing with Silverlight enabled on them (for a fee to MS from the TV maker), and developers using the DirectX-based Silverlight to create awesome games and experiences. I’d love for Microsoft to stop fudging what “Silverlight” actually is. Is it for Windows Phone 7 only? Can I keep making line-of-business apps with it? Or is it much more, something developers shouldn’t be apprehensive about treating as a general-purpose creative-content tool?
There may be too much money invested in the Xboxes and PlayStations of this world for such a disruption to occur. But if the PlayStation died in favour of Smart TVs, Sony would recoup the loss in Smart TV sales. What Nintendo lost in Wii sales would be recouped by selling Web-enabled mobile gaming devices.
In fact, the only loser in a switch away from consoles would be Microsoft and their Xbox. By introducing WebGL to their browser, Microsoft could keep themselves “in the game”, but I don’t consider them a particularly web-savvy company… Perhaps driven partly by a need to keep their lucrative Xbox empire alive, they have declared WebGL a huge security problem and won’t have anything to do with it. Meanwhile, the other browser vendors acknowledge the security problems and are working towards making the technology safe. Once their browsers reach that level of robustness, Microsoft and Internet Explorer will be so far behind the curve they will be dead in the water.
With the increasing might of Google, Apple, Intel and others behind WebGL, I don’t think Microsoft’s slipping popularity will be much of a delay to the technology gaining proper traction and pushing gaming in a new, connected, social, engaging, cross-platform, cross-device direction.
If buzzwords were currency, the one worth the most at the moment is undoubtedly “Cloud”. The problem with buzzwords is that nobody puts any faith in them; they’re bandwagons and may be gone tomorrow. Who wants to invest in something vague enough to mean pretty much anything? (There’s a “fuzzy” joke in there somewhere.)
A brief rant about Carbon Tax
But there is good news: buzzwords tend to be reinventions of things we’re all already familiar with. Remember the “Greenhouse Effect”? That was humans killing the planet by wearing deodorant. When we took all the nasty CFCs out of our bodysprays and the temperature kept on rising, rather than admit they might be wrong about the human effect, scientists rebranded it “Global Warming”. People with Range Rovers and engines powerful enough to put a smile on your face were castigated like despotic, tyrannical dictators. When the planet suddenly started to cool down and we suffered some of the worst winters on record, again scientists were loath to admit they might be wrong; instead it became an even vaguer term, “Climate Change”.
This is just semantics and at the heart of it lies an age-old buzzword that we’re all familiar with; “Seasons”.
Get back to talking about Cloud, Richard
“Cloud” is no different. In fact it has a very slow-moving lineage, with roots back in the 1960s when the US “Defense Advanced Research Projects Agency” (DARPA) built a distributed computer network intended to enable nationwide communication even in the event of Ivan bombarding the United States with uranium-tipped missiles.
The resulting ARPANET became the Internet, an interconnected landscape of computers using common languages to talk to each other, and it became attractive to consumers and normal folk in the 1990s with the invention of the “World Wide Web” and its own common language, HTTP. Tim Berners-Lee was the smart-alec behind the WWW; he was working at CERN at the time, the big European research centre that is currently smashing subatomic particles together at speeds that create impacts akin to two aircraft carriers crashing head-on. The invention of HTML – the text language behind web pages – and Tim’s coining of the phrase World Wide Web meant that the simple act of one computer asking another computer for information, and that second computer handing back the information it had been asked for, suddenly grew legs and became things like Google, Amazon and Facebook. And Yahoo!. Remember them?
That is the basis for “The Web”, a buzzword now so commonplace that even our elders are silver-surfing it. Requests for information, and Responses in the form of web pages, or images, or Facebook statuses. Etc. Etc.
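That request/response dance is simple enough to sketch in a few lines. Here’s a toy version using nothing but Python’s standard library: one “computer” (a thread on your own machine, in this case) serves a page, and another asks for it over HTTP. The page content is obviously made up.

```python
# A minimal request/response exchange -- the same dance every website performs.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class Page(BaseHTTPRequestHandler):
    """A tiny 'web server': it holds content and hands it out on request."""
    def do_GET(self):
        body = b"<html><body>Hello, World Wide Web</body></html>"
        self.send_response(200)                       # "here is what you asked for"
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                        # the Response

    def log_message(self, *args):                     # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Page)           # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The Request: one computer asks another for information over HTTP.
with urlopen(f"http://127.0.0.1:{server.server_port}/") as response:
    html = response.read().decode()

server.shutdown()
print(html)   # the Response: a web page
```

Everything from Google to Facebook is, at bottom, a vastly scaled-up version of that exchange.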
But What is a “Website”?
A website is content that lives on a computer somewhere on “the internet”. The computers (called “Web Servers”) that websites live on tend to be rented or leased (this equates to a monthly “hosting fee”) and you pay more for more disk space or a more capable connection to the Internet.
But isn’t that The Cloud?
Exactly my point. Although the recent surge in “cloud” focus has brought refinements to the Web concept, it is still, at its heart, a Request/Response system, using HTTP to talk between computers and spitting out content (usually HTML).
The things that are new are not dramatic changes; in fact the major change is more granular control over the “content” on the servers, so the payment model has changed slightly (if you believe the marketing, this is a change for the better, but only a limited number of people are raving about their savings). You can now store files as “blobs”, you can push your program code onto a “compute server”, and your data is stored in SQL and NoSQL databases. Each one is charged slightly differently to optimise bang-for-buck.
But don’t be afraid of the terminology; there is nothing new to see here. This delineation of files/programs/data has always been there on The Web, ever since the early ’90s when the only way to view web pages was on a black screen with luminous green text.
So should I invest in the cloud?
The question you need to ask is, would you invest if it was all still just called “The Web”?
I had an enlightening discussion this week with a guy who works on the same product as I do. He is a tad older, wiser and more reflective, so when he said he had identified a cycle in human history, I was all ears.
Mike, who once designed a mechanism for recycling toxic waste into fuel and toothpaste by accelerating it supersonically, began by looking at unemployment figures.
“Unemployment is now at its highest for 20 years,” he began, pointing out that the current crisis sounds like the biggest catastrophe ever to befall humanity, but it’s just that the media focus is now so much greater and more invasive in our lives that it seems like a bigger deal.
“The nationalised industries that failed in the 70s were formed as part of a radical post-war programme of the 50s and 60s where the people that won the war realised that they were being exploited by plutocrats who were not shot at or blown to bits during the war. It was only 20 years later that they were seen as outdated dinosaurs. This played out as a violent revolution in some countries (Cuba, Vietnam, China) and as a more constitutional revolution in most of Europe.
“The traditional nationalised industries of the 70s that had kept people in jobs were shrinking and people were forced to be more creative. This generated lots of new enterprises and was the genesis of the massive growth in the financial services industry.
“In the mid-to-late 80s, the mass unemployment was as bad as now, or worse. People reacted in the same ways.”
“The difference now is that it is fuelled by the internet rather than something else.”
I was born in the mid-80s, so I’d be lying if I said I could remember those bad times personally, but I’m inclined to believe and respect Mike’s opinions on how they are similar.
He finished the discussion with an insight: “I wonder if the 20-year cycle is a natural thing related to the kids of Generation N growing up and forming Generation N+1 as a way of spiting their parents…
“The sad part is that each generation is so up its own arse (back to my original email) that they think:
So let’s extrapolate this a bit to see how all of this StuffThatHasAlreadyHappened is even worth thinking about today.
The drivers of previous economic meltdowns were the collapses of large, established, respected businesses that suddenly lost the ability to sustain themselves, for whatever reason. The banks, for example, were let down by improper accounting practices that allowed money to go out that was never going to come back but still sat on the balance sheet as an asset. The steel and coal industries of yesteryear were obliterated by cheap offshoring and a reduction in demand thanks to synthetic materials and a boom in plastic production.
The new economic stars are web-based businesses such as Facebook, Twitter, Google, Netflix (which already suffered death throes this year, only to recover), Amazon, etc. Millions of businesses rely on Google AdWords for advertising, and millions of content publishers rely on AdSense to make a living. The same goes for Facebook’s ad machine.
Amazon’s Marketplace and Associates programme are buoying a similarly large number of organisations.
In “the olden days”, when a business died, it would typically take its supply chain with it. Steel businesses would also take down their shipping and sales partners. Banks died and took most of their financial partners with them. The next generation “supply chains” are all about online services.
Say Google and Amazon went dark tomorrow. They became no more. That would put so many people out of work that the benefits system wouldn’t be able to support them all. Stock analysts would quiver at the sudden loss of confidence as “large, established, respected” IT stocks dropped to zero, and a selling spree would signal the deaths of all the other post-IPO IT businesses. (I think Apple would be a special case in that scenario. It has a nicely diversified portfolio of hardware, software and services (iTunes, iCloud), which should prevent it from being easily pigeonholed and tarred with the same brush as these web-only organisations.)
20 years from now, things could get very interesting. We need to keep an eye on those kids who are showing entrepreneurial talents early because those are the ones who will dictate what will be the next big thing when Mark Zuckerberg’s digital estate becomes worthless…
Do you know what date Sky Digital started in the UK? I do. 1st October 1998. How do I know this off the top of my head? Well, it all started in the summer of that year…
Our analogue Sky box had started to play up in late July or early August. I remember this vividly because I was still at school and had spent all year looking forward to wasting my six-weeks-holiday watching every episode of Quantum Leap on The Sci-Fi Channel and Clarissa Explains It All on Nickelodeon. (Ah, Melissa Joan Hart, how many teenage hours did I waste yearning to climb a ladder into your bedroom window…). Instead, I was forced to watch the monotony of a blue screen while garbled audio played behind it. My Dad took the box into town to an “independent electrical specialist”, who gave him some very odd advice.