About This Case

Closed

28 Dec 2009, 11:59PM PT

Bonus Detail

  • Each Selected Insight
    Earns a $100 Bonus

Posted

20 Dec 2009, 11:27PM PT

Industries

  • Enterprise Software & Services
  • Government / Politics / Global Issues
  • Hardware
  • IT / IT Security
  • Real Estate
  • Telecom / Broadband / Wireless

IT Predictions for 2010 And Beyond

 

Closed: 28 Dec 2009, 11:59PM PT

Earn up to $100 for Insights on this case.

As 2010 looms, we're continuing our series of cases here to develop interesting, engaging and useful discussions for our new sub-site, IT Innovations. We're looking for insights that might help IT managers stay informed and keep their operations competitive.  

For this case, we're looking for your predictions for the upcoming year for data centers or IT management.  What changes do you anticipate for 2010?  What are you most looking forward to?  What trends from 2009 will really pick up in 2010?  What events are you planning for in 2010?  These are just a few of the questions we'd like to see answered.  It's hard to look into the crystal ball and predict the future with certainty, so if you want to evaluate past predictions and correct the predictions of popular IT pundits, feel free to do that, too.  Our main goal is to try to offer an insightful and practical outlook for the year ahead. 

This case isn't necessarily restricted to just 2010, either.  If you have predictions for the distant future, please share your thoughts on what you think will happen in the next decade and beyond as well.

7 Insights

Kishore Jethanandani has been a member of the Insight Community since December of 2009. Kishore's web site is http://kishorejets.typepad.com.
Kishore Jethanandani
Mon Dec 28 8:35am
Much of the business value of mobile cloud computing, unified messaging, multimedia communication, and social networking will come from mobile collaboration and the higher productivity it yields. More people are on the move and geographically dispersed, and they need to remain connected even when they are not in close proximity to their colleagues.

Widgets and the iPhone are changing the game for data centers. Making mashups with other applications, such as Facebook or Google's ever-extending suite, is becoming easier and easier. Adding the crucial sauce that lets the company add value is the most important function of the data center going forward.

Companies are realizing that it is cheaper to change the widgets, the platform, and the back-end logic separately. APIs will have to become more persistent, or extensible, and that is an area where there will have to be a lot more development. In a mass market, the lesson of the last few years is that you cannot simply hope that all users will change the application when something changes. Automatic updaters work, but only to a limited extent (and not, so far, for widgets). There will always be some users who refuse to change, and who will require backwards compatibility at some level - there is a reason that browser updates are concurrent with PC updates.
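
To make the backwards-compatibility point concrete, here is a minimal sketch in Python of version-pinned API handlers; the service, field names, and URL are hypothetical, not taken from any real product:

```python
# Hypothetical widget-profile API: old versions stay available forever,
# so un-upgraded widgets keep working when the back-end evolves.

def get_profile_v1(user_id):
    # The original response shape that existing widgets depend on.
    return {"id": user_id, "name": "Alice"}

def get_profile_v2(user_id):
    # Extended shape: strictly a superset of v1, so nothing breaks.
    profile = get_profile_v1(user_id)
    profile["avatar_url"] = "http://example.com/avatars/%d.png" % user_id
    return profile

# Explicit version routing; unknown versions fail loudly.
HANDLERS = {"v1": get_profile_v1, "v2": get_profile_v2}

def handle_request(version, user_id):
    handler = HANDLERS.get(version)
    if handler is None:
        raise ValueError("unknown API version: %s" % version)
    return handler(user_id)

print(handle_request("v1", 42))  # old widget: contract unchanged
print(handle_request("v2", 42))  # new widget: superset of v1
```

The design choice is the point: every published version keeps its handler, and new capability arrives as a new version rather than as a change to an old one.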

Managing the API will become the central function of the CIO and IT management in companies; a modularized (and virtualized) back-end can then be upgraded slowly and moved to whatever locations work best, from whatever perspective is most relevant for the organization. That is another big thing that happened this year: data centers became location-agnostic. Power, network, and a friendly location are all that is needed - and then they can basically move anywhere on the planet. This has been possible before; this year it became evident.

Managing the API also means managing the mashed-up resources. If you built a company around a sales application that all of a sudden disappears, what do you do? Managing the network becomes a crucial task, as does managing the resources used - the mashed-up resources. But how do you make sure the cloud does not go away? It means getting assurances from the providers of those resources that they will continue to exist and be usable, whatever happens - they must have their own data centers, with hot standby and all the old mainframe mainstays.
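
As a rough illustration of what defending against a disappearing resource can mean in code, here is a Python sketch that wraps a mashed-up resource with a locally cached standby copy; fetch_from_partner and the cache policy are hypothetical stand-ins, not any particular vendor's API:

```python
import time

_standby = {}  # local standby copies of partner data, keyed by resource

def fetch_from_partner(key):
    # Stand-in for the real network call to the mashed-up service;
    # here it always fails, to exercise the fallback path below.
    raise ConnectionError("partner service is unreachable")

def get_resource(key, max_age_seconds=3600):
    try:
        value = fetch_from_partner(key)
        _standby[key] = (time.time(), value)  # refresh the standby copy
        return value
    except ConnectionError:
        cached = _standby.get(key)
        if cached and time.time() - cached[0] < max_age_seconds:
            return cached[1]  # serve stale-but-usable standby data
        raise  # no usable standby: surface the outage to the caller
```

This only papers over short outages, of course; the contractual assurances described above are what cover the case where the service disappears for good.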

Modularizing mashup resources and placing portions in the data centers of corporate customers will be one possibility (essentially, becoming the cloud). Another possibility is that the cloud operators will be trusted and trustable, so nobody has to have their own data centers. But that is hard to believe. More data centers, better managed, with new software features as development continues (and new protocols to communicate between them), is more likely - and a more interesting problem and possibility for data center managers.

Heading into 2010, there are five things to watch that will dominate the field of technology. Some significant issues have developed in the last few years, and 2010 will see them come to a head, if not reach a resolution.

The Cloud
As we head into 2010, the cloud will continue to dominate the technology scene, hastened by the development of mobile devices. Its real traction will depend on the accessibility of wireless and broadband services. At the end of 2009, urbanites are enjoying low-cost broadband, while rural areas still struggle to get affordable high-speed internet access. Additionally, the cost and terms of data plans hold back the development of additional technologies and services.

Net Neutrality

Internet providers continue to test models of making money, and 2010 will see the rise of some not-so-good models, with government stepping in to draw clear lines of authority over service providers and how services are provided.

Access to Broadband
As various European countries toy with plans to combat piracy using three-strikes-type laws, the United States will move toward drawing bright lines on whether access to the internet is a luxury or a necessity. If it lands on the latter, the US will need to make significant investments in infrastructure to ensure rural areas get access to reliable, low-cost internet.

Privacy
A perennial issue, privacy will remain among the top issues of 2010. As users become more aware of how their information is used and shared, and of the value of that information, they'll become increasingly fickle toward companies that abuse their goodwill.

E-books
2009 saw the rise of the e-book; 2010 will see that field broaden, with the cream eventually rising to the top. The market will face many of the issues music has faced over the years: ownership, distribution, resale rights, and DRM. We will see some companies moving to more proprietary formats and locked-down systems, forcing users through their channels to access or add data to their readers. However, the companies that embrace open, accessible formats will gain broader acceptance among users, though they may fall afoul of publishers. The seeds of these issues were sown in 2009, and 2010 will see them blossom.

2010 promises to be an exciting (and interesting) year in technology. We will see whether grassroots efforts can beat out the strength and money of the telecom companies, and whether privacy, openness, and accessibility really are important to users.

Joshua Howe is a Certified Rehabilitation Counselor (CRC) specializing in accessibility and technology for individuals with disabilities.
James Stevens
Tue Apr 6 2:29pm
Since the US is so large geographically, bringing broadband to everyone is quite the challenge (especially here in Texas). About 60-70% of Americans have broadband access.

The answer for these rural areas, where it's not logical for ISPs to spend thousands to bring lines to homes, is definitely terrestrial wireless solutions. This market is still very young and untapped, and competition for rural customers will heat up in the next few years.

These small-business WISPs are much more efficient, logical, experienced, flexible, and tested in these rural markets compared to large ISPs or government-stimulus-backed projects to expand into rural areas.

This past year has been one heck of a shift for computing in general, and the big factor has been cloud computing. While not a new technology by any means - distributed computing goes back at least thirty years in academia - the idea of providers offering a huge pool of resources as a single "entity" from which users can draw whatever they wish is a fresh new consumer model.

In fact, before 2009, Amazon had already begun its conquest on the supply side with AWS, which currently dominates the cloud environment, and in 2010 they're focusing on the provision of private clouds. Rackspace Cloud, formerly known as Mosso, is also on the rise, providing a great environment, although they don't quite have the vastness of Amazon.

Another move in favor of cloud computing has come from the Android OS and netbooks. Many applications are now driven by Google's online presence, and the wide popularity of netbooks will continue to drive hardware costs down and help convince the general populace of where to store their data next. Even cell phones running Android now download a copy of the user's contacts directly from the cloud and sync up, instead of going through the hassle of moving contacts from phone to phone with cabled software.

With these types of suppliers springing forth with strong capital backing, analysts are already forecasting a shift in the paradigm of computing with many applications moving from standalone to cloud.   And this train has only just left the station.

Chief Strategy Officer of Firelace, Inc. Our flagship product is Merchant's Mirror (http://www.merchantsmirror.com), an online accounting suite for small businesses.

In broad terms I'd argue that a major story of 2009, and perhaps *the* major story, is the near-total acceptance of the cloud computing paradigm as a powerful approach to building infrastructures and solving many IT problems.

Although "the cloud" was hardly a new theme for 2009, broad recognition of the versatility and cost-effectiveness of this approach has only come in the past few years, and 2009 was arguably the first year it enjoyed truly widespread acceptance.

Few enterprise IT managers would now balk at the idea of running large scale solutions from within the cloud, where even a few years ago this approach might have been rejected as too risky or unstable.    

Many factors have contributed to this widespread acceptance.   In economic terms the cloud has enabled more effective cost management and resource allocations as computing becomes more of a utility expenditure and companies can more easily scale their computing to meet their changing needs.  

Sourcing even major enterprise applications to external data centers has become commonplace and in many cases the obvious solution, especially for startups or modest-sized companies that cannot yet afford a robust internal infrastructure. Even for companies that can afford one, it may be advantageous to use the stability and scalability of "in the cloud" alternatives for most of their IT needs.

However, I think an excellent record of stability has been the key reason for the widespread acceptance of cloud computing. Avoiding downtime is arguably the biggest and most conspicuous IT challenge. Where the "old" cloud often carried more risk than in-house alternatives, the current infrastructure in most remote IT centers offers nearly perfect uptime, thanks to a variety of hardware and application improvements such as load-balancing routines and superior virtual server software.
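
For illustration, here is a toy sketch in Python of the kind of load-balancing routine being described: round-robin selection with a health filter (the server addresses and health states are invented for the example):

```python
import itertools

SERVERS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
HEALTHY = {"10.0.0.1": True, "10.0.0.2": False, "10.0.0.3": True}

_rotation = itertools.cycle(SERVERS)

def pick_server():
    # Walk the rotation, skipping backends marked unhealthy;
    # give up after one full pass so a dead pool fails fast.
    for _ in range(len(SERVERS)):
        server = next(_rotation)
        if HEALTHY.get(server):
            return server
    raise RuntimeError("no healthy backends available")

print(pick_server())  # 10.0.0.1, then 10.0.0.3, then 10.0.0.1, ...
```

Real balancers add weighting, connection draining, and active health probes, but the uptime benefit comes from exactly this: traffic silently routes around a failed machine.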

Finally, I think the cloud forms an important backstory to other major IT developments of the year, such as the phenomenal rise of Twitter and Facebook as key social mechanisms for a rapidly evolving online world. Although social networking does not require a cloud infrastructure, the existence of the cloud has created much more fertile conditions for the development and fast spread of services like Facebook and Twitter. Both have development ecosystems that benefit from the low costs and scalability cloud computing has brought to the online table. Users, too, have quickly come to rely on applications such as Gmail and Google Docs, where their information is no longer held captive on a single machine and is instead available from a growing number of devices for work or social collaboration.

In no small measure thanks to the widespread acceptance of the cloud we've seen in the past year, we will be better prepared to usher in a cheaper and more interesting IT 2010.

Happy New Year!   

Publisher of travel, history, and news at several regional and national websites and blogs including "Technology Report". Annual Technology conference coverage includes Consumer Electronics Show in Las Vegas and Search Engine Strategies San Jose.

Artificial General Intelligence (AGI) is for many the holy grail of computing. Humans have enjoyed consciousness and "self awareness" for some time, but to date no machine has attained a comparable level of intellectual ability, despite huge advances in specific fields like gaming and math where computers dramatically surpassed human abilities long ago.

No longer relegated to science fiction, the idea is now taken seriously: many believe the question is not whether humans can develop an artificial intelligence with human-like intellectual abilities, but only "when".

Ray Kurzweil, author of "The Singularity is Near" and of the upcoming film "Singularity", is among the most optimistic experts on the topic of general AI. He sees general AI developing within the next twenty-five years, probably followed in very rapid succession by an "explosion of machine intellect" and a "Technological Singularity" that will deliver a world of almost unimaginable technological sophistication.

Last year I had the chance to speak with several brilliant insiders about the topic of "conscious computing," which many would argue will be similar in form and function to AGI. I was surprised by the diversity of opinion, though all seemed to agree that it is only a matter of time before we see machines that think much like we do.

Marissa Mayer of Google, whose graduate work at Stanford was in AI, suggested a time frame of about ten years for conscious computing.    Mayer noted that Google researchers were intrigued by some of the current algorithmic outputs from the search routines which are obviously not AI but do in fact look like the type of output you'd expect from an intelligent agent.   Her optimistic timeframe is consistent with Kurzweil's notions that computing hardware and software improvements are proceeding at exponential rather than linear rates and therefore we are likely to see major gains happen over shorter and shorter timeframes. 
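
A quick back-of-the-envelope calculation (in Python, with an assumed doubling period chosen purely for illustration) shows why exponential improvement compresses most of the gain into the final years:

```python
# Assume capability doubles every 1.5 years (illustrative, not a claim).
doubling_period_years = 1.5

for year in range(0, 11, 2):
    capability = 2 ** (year / doubling_period_years)
    print("year %2d: %6.1f x" % (year, capability))

# year  0:    1.0 x ... year 10:  101.6 x
# The last two years alone add more absolute capability (~61x)
# than the first eight years combined (~39x).
```

That compression of gains into ever-shorter windows is the arithmetic behind Kurzweil-style forecasts.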

Matt Cutts, a prominent Google search engineer, was not as optimistic as Mayer, suggesting it could take 45 years given the incredible complexity of the algorithms needed to duplicate human-style intelligence. However, Cutts also noted that his background is not in AI.

Perhaps not surprisingly, the vaguest answer I got was from Google's own AI legend, Peter Norvig, at the Convergence conference in Mountain View in 2008. He was not even comfortable that we could define "consciousness" well enough to use it as a milestone, so to him the question of "when" was in some ways irrelevant. However, I think even Norvig would agree that human-quality intellect is not likely to remain solely within the human sphere much longer.

Two of the most promising projects in the field of general AI are DARPA's SyNAPSE and Blue Brain. In simple terms, these projects highlight two different approaches to developing a human-like artificial intelligence. For SyNAPSE, the AI algorithms, software, and computational power are the key focus. IBM recently announced the project had essentially created the equivalent computational power, and arguably thus the "intelligence", of a "cat brain" using an IBM Blue Gene supercomputer in Silicon Valley.

Challenging IBM's assertion was Dr. Henry Markram, the architect of the Blue Brain project in Lausanne, Switzerland. The Blue Brain approach is more along the lines of reverse-engineering the functions of a human brain, incorporating various human-like signals as well as the massive processing power of an IBM Blue Gene supercomputer.

Regardless of the mechanism of development, it's reasonable to assume humanity will have a conscious, self-aware computer within the next few decades. Although some fear the advent of an "unfriendly AI" that would threaten the very existence of its creators, I think the lesson of our own intellectual development over many centuries is that intelligence is likely to spawn greater compassion, and perhaps even bring solutions to problems that have heretofore seemed insurmountable given our feeble human intellectual constraints.


Publisher of travel, history, and news at several regional and national websites and blogs including "Technology Report". Annual Technology conference coverage includes Consumer Electronics Show in Las Vegas and Search Engine Strategies San Jose.

Here are two predictions for 2010. First, security in general will get much better - but the challenges for IT managers will also get much more complicated.

The year ended as Barack Obama finally selected his own cyber-security czar -- but that's just part of the federal government's plan to modernize its role in the security of the internet. In October the Department of Homeland Security opened the National Cybersecurity and Communications Integration Center, promising "a 24-hour, DHS-led coordinated watch and warning center that will improve national efforts to address threats and incidents affecting the nation's critical information technology and cyber infrastructure." And meanwhile, the Pentagon has even established a new Cyber Command.

But the federal government has also been leading the push to protect private information. Back in 2006, the Office of Management and Budget mandated that government agencies encrypt all data stored on mobile devices. And more than half the states in the country have beefed up federal privacy laws with their own new state privacy regulations... But ironically, one of the biggest factors in improving security may have been dramatic news stories about weak security. In 2007, 800,000 Social Security numbers were stolen when a 22-year-old intern lost a back-up tape he'd stored in the back of his car -- and a few weeks later, data was also stolen for every National Guard soldier in Idaho. Later that year, the security administrator at the Oregon State Treasury joked to me that some organizations were implementing stronger security protocols simply because "People like to stay off of the front page!"
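
To make the encryption mandate concrete, here is a minimal modern sketch in Python using the third-party cryptography package (not a tool mentioned in any of these reports, and with key handling deliberately oversimplified):

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key lives in a key vault or on a smart card --
# never on the same laptop or backup tape as the data itself.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"SSN: 123-45-6789"
token = fernet.encrypt(record)   # safe to store on a lose-able device
print(fernet.decrypt(token))     # recoverable only with the key
```

With data encrypted at rest, a misplaced backup tape like the one in the 2007 incident becomes an inconvenience rather than a breach.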

I've wondered if there might be fewer breaches at datacenters in 2010 as an indirect result of the financial bail-outs in 2008. Public anger is high, and it's being directed at nearly every institution in the financial industry. So it's the worst possible time to have to explain to customers that your data's been breached -- and I predict managers will take data security much more seriously. And if not for their customers, then for potential business partners -- since the obvious result of the financial crisis has been mergers and acquisitions.

Second, I predict IT managers will face a more complicated environment -- partly because of the growing adoption of mobile devices. The popularity of the iPhone has spawned high-powered competitors like Android, and more and more end users are going to have them. But if the future is mobile devices, that means more users trying to connect to the corporate network with unpatched and potentially virus-infected personal data accessories. If you don't have a good network access policy, you'll be facing a whole new class of threats -- and users also need to be educated about the dangers of using their new mobile devices to transport sensitive work data out of the office. The growth in mobile devices may also lead to a greater emphasis on encryption -- to protect data from the careless habits of end users. But IT managers also need to focus on network access control, to deal with threats that arise when those same users try to re-connect!
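
As a final sketch, here is what a minimal network access control posture check might look like in Python; the device attributes and thresholds are hypothetical, purely to illustrate the verify-then-admit-or-quarantine idea:

```python
MIN_PATCH_LEVEL = 7  # assumed policy threshold, for illustration

def admit(device):
    # Quarantine any device that fails the posture checks.
    checks = [
        device.get("patch_level", 0) >= MIN_PATCH_LEVEL,
        device.get("antivirus_running", False),
        not device.get("jailbroken", True),  # unknown: assume the worst
    ]
    return "corporate-lan" if all(checks) else "quarantine-vlan"

print(admit({"patch_level": 8, "antivirus_running": True,
             "jailbroken": False}))          # -> corporate-lan
print(admit({"patch_level": 3}))             # -> quarantine-vlan
```

Real NAC products do this with 802.1X and posture agents rather than a dictionary, but the policy logic - verify the device, then admit or quarantine it - is the same.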