MarketBite



Media Tablets and Beyond.

  • Users can choose between various form factors when it comes to mobile computing. No single platform, form factor or technology will dominate and companies should expect to manage a diverse environment with two to four intelligent clients through 2015. IT leaders need a managed diversity program to address multiple form factors, as well as employees bringing their own smartphones and tablet devices into the workplace.

 

  • Enterprises will have to come up with two mobile strategies – one to address the business to employee (B2E) scenario and one to address the business to consumer (B2C) scenario. On the B2E front, IT must consider social goals, business goals, financial goals, and risk management goals. On the B2C front, which includes business to business (B2B) activities to support consumers, IT needs to address a number of additional issues such as surfacing and managing APIs to access enterprise information and systems, integration with third-party applications, integration with various partners for capabilities such as search and social networking, and delivery through app stores.

 

Mobile-Centric Applications and Interfaces.

  • The user interface (UI) paradigm in place for more than 20 years is changing. UIs with windows, icons, menus, and pointers will be replaced by mobile-centric interfaces emphasizing touch, gesture, search, voice and video. Applications themselves are likely to shift to more focused and simple apps that can be assembled into more complex solutions. These changes will drive the need for new user interface design skills.

 

  • Building application user interfaces that span a variety of device types, potentially from many vendors, requires an understanding of fragmented building blocks and an adaptable programming structure that assembles them into optimized content for each device. Mobile consumer application platform tools and mobile enterprise platform tools are emerging to make it easier to develop in this cross-platform environment. HTML5 will also provide a long-term model to address some of the cross-platform issues. By 2015, mobile Web technologies will have advanced sufficiently that half of the applications that would be written as native apps in 2011 will instead be delivered as Web apps.
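
To make the idea of an adaptable programming structure more concrete, here is a minimal, hypothetical sketch in Python: the same content blocks are assembled into output tuned to each device class. The `DEVICE_PROFILES` table, block names and `render_page` helper are invented for illustration and do not represent any particular mobile platform tool.

```python
# Minimal sketch of cross-device content assembly.
# DEVICE_PROFILES and the block names are illustrative assumptions,
# not part of any particular mobile platform product.

DEVICE_PROFILES = {
    "phone":   {"max_images": 1, "layout": "single-column"},
    "tablet":  {"max_images": 4, "layout": "two-column"},
    "desktop": {"max_images": 8, "layout": "three-column"},
}

CONTENT_BLOCKS = [
    {"type": "headline", "text": "Quarterly results"},
    {"type": "image", "src": "chart1.png"},
    {"type": "image", "src": "chart2.png"},
    {"type": "body", "text": "Revenue grew in all regions."},
]

def render_page(device_class: str) -> dict:
    """Assemble the shared content blocks into a page tuned to one device class."""
    profile = DEVICE_PROFILES.get(device_class, DEVICE_PROFILES["desktop"])
    images = [b for b in CONTENT_BLOCKS if b["type"] == "image"]
    others = [b for b in CONTENT_BLOCKS if b["type"] != "image"]
    return {
        "layout": profile["layout"],
        "blocks": others + images[: profile["max_images"]],
    }

if __name__ == "__main__":
    for device in ("phone", "tablet", "desktop"):
        page = render_page(device)
        print(device, page["layout"], len(page["blocks"]), "blocks")
```

In practice, frameworks get the same effect either on the client with HTML5 and CSS media queries or on the server with adaptation logic shaped like this sketch.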

 

Contextual and Social User Experience.

  • Context-aware computing uses information about an end-user’s or object’s environment, activities, connections and preferences to improve the quality of interaction with that end-user or object. A contextually aware system anticipates the user’s needs and proactively serves up the most appropriate and customized content, product or service. Context can be used to link mobile, social, location, payment and commerce. It can help build skills in augmented reality, model-driven security and ensemble applications. Through 2013, context-aware applications will appear in targeted areas such as location-based services, augmented reality on mobile devices, and mobile commerce (a minimal sketch of context-driven selection follows these bullets).

 

  • On the social front, the interfaces for applications are taking on the characteristics of social networks. Social information is also becoming a key source of contextual information to enhance delivery of search results or the operation of applications.
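
As a minimal illustration of the context-driven selection mentioned above, the hypothetical Python sketch below scores a small offer catalogue against a user's location, time of day and social signals. The offers, context fields and weights are all invented; a real system would draw them from location services, payment history and social graphs.

```python
# Hypothetical context-aware recommendation sketch.
# Offers, context fields and scoring weights are illustrative only.

OFFERS = [
    {"name": "coffee voucher", "near": "train_station", "hours": range(6, 11)},
    {"name": "lunch deal",     "near": "office_park",   "hours": range(11, 15)},
    {"name": "movie ticket",   "near": "mall",          "hours": range(17, 23)},
]

def recommend(context: dict) -> list:
    """Rank offers by how well they match the user's current context."""
    scored = []
    for offer in OFFERS:
        score = 0
        if context.get("location") == offer["near"]:
            score += 2                      # physical proximity matters most
        if context.get("hour") in offer["hours"]:
            score += 1                      # right time of day
        if offer["name"] in context.get("friends_liked", []):
            score += 1                      # social signal
        scored.append((score, offer["name"]))
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

if __name__ == "__main__":
    ctx = {"location": "mall", "hour": 19, "friends_liked": ["movie ticket"]}
    print(recommend(ctx))   # the movie ticket ranks first for this context
```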

 

Internet of Things.

  • The Internet of Things (IoT) is a concept that describes how the Internet will expand as sensors and intelligence are added to physical items, such as consumer devices or physical assets, and these objects are connected to the Internet. The vision and concept have existed for years; however, there has been an acceleration in the number and types of things being connected and in the technologies for identifying, sensing and communicating. These technologies are reaching critical mass and an economic tipping point over the next few years. Key elements of the IoT include:

 

  • Embedded sensors: Sensors that detect and communicate changes are being embedded, not just in mobile devices, but in an increasing number of places and objects (a toy sketch of this detect-and-report loop follows this list).
  • Image Recognition: Image recognition technologies strive to identify objects, people, buildings, places, logos, and anything else that has value to consumers and enterprises. Smartphones and tablets equipped with cameras have pushed this technology from mainly industrial applications to broad consumer and enterprise applications.
  • Near Field Communication (NFC) payment: NFC allows users to make payments by waving their mobile phone in front of a compatible reader. Once NFC is embedded in a critical mass of phones for payment, industries such as public transportation, airlines, retail and healthcare can explore other areas in which NFC technology can improve efficiency and customer service.
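
As a toy illustration of the embedded-sensor element in the first bullet, the sketch below polls a simulated temperature sensor and publishes only meaningful changes, which is the basic detect-and-communicate loop. The `read_sensor` and `publish` functions are stand-ins, not a real device or gateway API.

```python
import random
import time

# Toy IoT sketch: detect changes from an embedded sensor and report them.
# read_sensor() simulates hardware; a real device would read GPIO/I2C/etc.

def read_sensor() -> float:
    """Simulated temperature reading in degrees Celsius."""
    return 21.0 + random.uniform(-1.5, 1.5)

def publish(event: dict) -> None:
    """Stand-in for sending the event to a gateway or cloud endpoint."""
    print("publish:", event)

def monitor(threshold: float = 0.5, samples: int = 10) -> None:
    """Report a reading only when it changes by more than `threshold`."""
    last_reported = None
    for _ in range(samples):
        value = read_sensor()
        if last_reported is None or abs(value - last_reported) > threshold:
            publish({"sensor": "temp-01", "celsius": round(value, 2)})
            last_reported = value
        time.sleep(0.1)   # polling interval; real devices often sleep far longer

if __name__ == "__main__":
    monitor()
```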

 

App Stores and Marketplaces.

  • Application stores from Apple and Google provide marketplaces where hundreds of thousands of applications are available to mobile users. Gartner forecasts that by 2014, there will be more than 70 billion mobile application downloads from app stores every year. This will grow from a consumer-only phenomenon to an enterprise focus. With enterprise app stores, the role of IT shifts from that of a centralized planner to a market manager providing governance and brokerage services to users and potentially an ecosystem to support entrepreneurs. Enterprises should use a managed diversity approach to focus app store efforts and segment apps by risk and value.

 

Next-Generation Analytics. Analytics is growing along three key dimensions:

 

  • From traditional offline analytics to in-line embedded analytics. This has been the focus for many efforts in the past and will continue to be an important focus for analytics.
  • From analyzing historical data to explain what happened to analyzing historical and real-time data from multiple systems to simulate and predict the future.
  • Over the next three years, analytics will mature along a third dimension: from structured and simple data analyzed by individuals to analysis of complex information of many types (text, video, etc.) from many systems, supporting a collaborative decision process that brings multiple people together to analyze, brainstorm and make decisions.
  • Analytics is also beginning to shift to the cloud and exploit cloud resources for high performance and grid computing.
  • In 2011 and 2012, analytics will increasingly focus on decisions and collaboration. The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action.
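
A minimal sketch of that last point: embedding a prediction, rather than a static report, into a process step. It fits a simple linear trend to historical daily order volumes and uses the forecast to decide staffing at the moment the decision is taken. The data, the trend model and the staffing rule are invented for illustration and are not tied to any analytics product.

```python
# Illustrative only: embed a simple forecast into a business-process decision.
# Historical volumes and the staffing rule are made up.

def fit_trend(history):
    """Least-squares line through (day_index, volume) pairs."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

def forecast(history, days_ahead=1):
    """Project the fitted trend a few days forward."""
    slope, intercept = fit_trend(history)
    return intercept + slope * (len(history) - 1 + days_ahead)

def staff_needed(history, orders_per_agent=50):
    """Decision made at the point of action, using the forecast."""
    predicted = forecast(history)
    return max(1, round(predicted / orders_per_agent))

if __name__ == "__main__":
    daily_orders = [410, 432, 455, 470, 498, 510, 540]   # invented history
    print("forecast for tomorrow:", round(forecast(daily_orders)))
    print("agents to schedule:", staff_needed(daily_orders))
```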

 

Big Data.

  • The size of data, the complexity of formats and the speed of delivery exceed the capabilities of traditional data management technologies; managing the volume alone requires new or exotic technologies. Many new technologies are emerging with the potential to be disruptive (e.g., in-memory DBMS). Analytics has become a major driving application for data warehousing, with the use of MapReduce outside and inside the DBMS, and the use of self-service data marts. One major implication of big data is that in the future users will not be able to put all useful information into a single data warehouse. Logical data warehouses that bring together information from multiple sources as needed will replace the single data warehouse model.
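
Since the item mentions MapReduce, here is a minimal in-process sketch of the pattern in Python: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Real deployments distribute the same phases across many machines; the log records below are invented sample data.

```python
from collections import defaultdict

# Minimal single-process MapReduce sketch (illustrative sample data).

def map_phase(record):
    """Emit (key, value) pairs; here, count successful hits per URL."""
    _, url, status = record.split()
    if status == "200":
        yield (url, 1)

def reduce_phase(key, values):
    """Aggregate all values emitted for one key."""
    return key, sum(values)

def run(records):
    groups = defaultdict(list)            # "shuffle": group values by key
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

if __name__ == "__main__":
    logs = [
        "10:01 /home 200",
        "10:02 /pricing 200",
        "10:02 /home 500",
        "10:03 /home 200",
    ]
    print(run(logs))   # {'/home': 2, '/pricing': 1}
```

The appeal of the model is that the map and reduce functions stay the same whether the run is one process or thousands of nodes.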

 

In-Memory Computing.

  • Gartner sees huge use of flash memory in consumer devices, entertainment equipment and other embedded IT systems. In addition, it offers a new layer of the memory hierarchy in servers that has key advantages — space, heat, performance and ruggedness among them. Besides delivering a new storage tier, the availability of large amounts of memory is driving new application models. In-memory applications platforms include in-memory analytics, event processing platforms, in-memory application servers, in-memory data management and in-memory messaging.

 

  • Running existing applications in-memory or refactoring these applications to exploit in-memory approaches can result in improved transactional application performance and scalability, lower latency (less than one microsecond) application messaging, dramatically faster batch execution and faster response time in analytical applications. As cost and availability of memory intensive hardware platforms reach tipping points in 2012 and 2013, the in-memory approach will enter the mainstream.
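
A rough sketch of why the in-memory approach helps analytics: repeatedly aggregating rows that are already resident in memory versus re-reading and re-parsing them from disk for every query. The synthetic dataset and timing harness are purely illustrative; production in-memory platforms add columnar storage, compression and distribution on top of this basic effect.

```python
import csv, os, tempfile, time

# Illustrative comparison: repeated aggregation from disk vs. from memory.
# The synthetic data and timings only show the shape of the difference.

ROWS = [(i % 100, float(i % 7)) for i in range(200_000)]   # (region_id, amount)

def write_csv(path):
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(ROWS)

def total_from_disk(path):
    """Re-open and re-parse the file for every query."""
    with open(path, newline="") as f:
        return sum(float(amount) for _, amount in csv.reader(f))

def total_from_memory(rows):
    """Aggregate rows already resident in RAM."""
    return sum(amount for _, amount in rows)

if __name__ == "__main__":
    path = os.path.join(tempfile.gettempdir(), "demo_rows.csv")
    write_csv(path)

    start = time.perf_counter()
    for _ in range(5):
        total_from_disk(path)
    disk_s = time.perf_counter() - start

    start = time.perf_counter()
    for _ in range(5):
        total_from_memory(ROWS)
    mem_s = time.perf_counter() - start

    print(f"disk: {disk_s:.3f}s  memory: {mem_s:.3f}s")
```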

 

Extreme Low-Energy Servers.

  • The adoption of low-energy servers — the radical new systems being proposed, announced and marketed by mostly new entrants to the server business — will take the buyer on a trip backward in time. These systems are built on low-power processors typically used in mobile devices. The potential advantage is packing 30 times or more processors into a given server unit, with lower power consumption than current server approaches. The new approach is well suited to certain non-compute-intensive tasks such as map/reduce workloads or delivery of static objects to a website. However, most applications will require more processing power, and the low-energy server model potentially increases management costs, undercutting broader use of the approach.

 

Cloud Computing.

  • Cloud is a disruptive force and has the potential for broad long-term impact in most industries. While the market remains in its early stages in 2011 and 2012, it will see the full range of large enterprise providers fully engaged in delivering a range of offerings to build cloud environments and deliver cloud services. Oracle, IBM and SAP all have major initiatives to deliver a broader range of cloud services over the next two years. As Microsoft continues to expand its cloud offering, and these traditional enterprise players expand offerings, users will see competition heat up and enterprise-level cloud services increase.

 

  • Enterprises are moving from trying to understand the cloud to making decisions about which workloads to implement on cloud services and where they need to build out private clouds. Hybrid cloud computing, which brings together external public cloud services and internal private cloud services, as well as the capabilities to secure, manage and govern the entire cloud spectrum, will be a major focus for 2012. From a security perspective, new certification programs, including FedRAMP and CAMM, will be ready for initial trial, setting the stage for more secure cloud computing. On the private cloud front, IT will be challenged to bring operations and development groups closer together using “DevOps” concepts in order to approach the speed and efficiencies of public cloud service providers.

read more


  • In a bid to strengthen its presence in the European market, India’s third-largest information technology services company, Wipro Technologies, opened a data centre in Meerbusch, Germany, with an investment of €15 million (Rs 100 crore). The opening of the centre fits into the company’s strategy of going local in the European market.
  • “The Meerbusch data center enables us to be closer to our European customers. This would help us in providing managed services and cloud offerings to our European customers who want our local presence there,” said T K Kurien, CEO of Wipro’s IT business.
  • As part of the company’s restructuring in January, Kurien had also charted out a programme of going local and being close to customers. As per this strategy, the company has even appointed separate business heads for countries such as Germany and France.
  • Europe is the second-fastest growing geography for the company after the APAC region.
  • According to a recent study by Deutsche Bank and the Value Leadership Group, the slowdown in economic activity in Europe could spur large European banks to increase their business with Indian IT services companies more than with global companies.

read more


  • One of the biggest costs in a data centre is power. Power is needed to run the servers and storage systems, and since these systems heat up, power is again required to cool them with air-conditioning.
  • In the dot-com era, data centres consumed 1 or 2 MW. Now it’s common to find facilities that require 20 MW, and already some of them expect to use ten times as much in the years to come. For countries, data centre power consumption is expected to put an enormous burden on their power grids.
  • Not surprisingly, an enormous amount of innovation, right from the semiconductor level, is today going into finding solutions to reduce power consumption in computing systems. IBM’s India Software Lab in Bangalore has just contributed towards that. It has developed a system to run data centres on solar power, and is making it commercially available, perhaps the first such commercial offering in the world.
  • “Until now, no one has engineered solar power for efficient use in IT,” said Rod Adkins, senior VP of IBM’s systems and technology group, who was in Bangalore last week. “We’ve designed a solar solution to bring a new source of clean, reliable and efficient power to energy-intensive, industrial-scale electronics.”
  • The first implementation is being done at the Bangalore lab itself. A solar power array has been installed, spread over more than 6,000 sq ft of the lab’s rooftop. Kota Murali, chief scientist at IBM India, says the installation is capable of providing 50 kilowatts of electricity for up to 330 days a year, for an average of five hours a day. The advantage of solar power is that it is DC (direct current), unlike grid power, which is AC (alternating current).
  • Processors run on DC, so when you use grid power, you need to convert AC to DC. In the process of that conversion, you lose about 13% of power. On the other hand, when you do a DC to DC conversion, the loss is only 4%. So you have a saving of close to 10%.
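
A quick back-of-the-envelope check of that “close to 10%” figure, using only the loss percentages quoted above and an assumed 50-kilowatt IT load:

```python
# Rough check of the quoted conversion-loss figures (13% vs. 4%).
# The 50 kW IT load is an assumed round number for illustration.

it_load_kw = 50.0            # DC power the IT equipment actually needs

ac_input = it_load_kw / (1 - 0.13)   # grid AC, ~13% lost in AC-to-DC conversion
dc_input = it_load_kw / (1 - 0.04)   # solar DC, ~4% lost in DC-to-DC conversion

saving = (ac_input - dc_input) / ac_input * 100
print(f"AC path draws {ac_input:.1f} kW, DC path {dc_input:.1f} kW")
print(f"input power saved: {saving:.1f}%")   # roughly 9-10%, i.e. 'close to 10%'
```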

read more


  • Companies can save millions of dollars spent on cooling data centres if they adopt newer technologies that increase efficiency, according to global chip maker Intel.
  • At present, data centre environments are typically kept at temperatures between 18 and 21 degrees Celsius.
  • “However, many data centres run at 27 degrees Celsius or even warmer without any significant impact,” Nick Knupffer, marketing manager for Intel’s Data Centre and Connected Systems Product Group in APAC and PRC, told reporters here.
  • This can lead to significant savings for companies, as they will not only save on electricity costs but will also produce fewer greenhouse gas emissions and waste less water, he added.
  • According to estimates, data centres today consume 1.5 per cent of the world’s energy, at an annual cost of USD 26 billion.
  • A data centre is a facility used to house computer systems and associated components, such as telecommunications and storage systems.

read more


  • Malaysian web hosting provider Teliti International announced on Thursday it has partnered with a group of technologies and solutions providers including Emerson Network Power to build a green data center.
  • The data center will have an ultimate capacity of 120,000 square feet and is expected to be Asia’s largest green data center. Located in the Bandar Enstek technology park, the data center is just 10 minutes away from Kuala Lumpur International Airport.
  • Set to open with an initial 45,000 square feet in the first half of 2012, the data center will offer customers a range of services, including web hosting, flexible colocation, cloud computing, and fully managed information technology services such as managed storage and processing on demand.
  • “The world-class technologies from Emerson Network Power and our other partners provide us with the unique ability to monitor and control every aspect of our data center infrastructure, including power, cooling, and space utilization,” said Musa Mohd Lazim, CEO of Teliti Datacentres. “This allows us to fine-tune data center operations and to observe trends in capacity utilization to ensure optimal management. In bringing together a consortium of companies with industry-leading expertise in their respective fields, we have found a winning formula for a sustainable, cutting-edge green data center.”
  • Teliti will deploy a range of Emerson Network Power technologies, including UPS systems, surge protection, battery monitoring and power transfer switches; precision cooling and cold aisle containment; power distribution units and enclosures; and access, control and monitoring software and hardware.

read more


  • AOL has been operating a trial datacenter that runs without any on-site staff since the start of the month, and reports that the system is resilient and cuts costs.
  • Dubbed ATC, the datacenter uses off-the-shelf, pre-racked, vendor-integrated gear with open-source code, is run as a 100 per cent lights-out facility (no BOFH patrolling the racks), and was put together in 90 days from the first proposal. AOL’s special sauce is its configuration management system, which the company says can set up and start a virtual machine in eight seconds and set up global server systems in minutes.
  • “The provisioning systems were built to be universal so that if required we can do the same thing with stand-alone physical boxes or virtual machines. No difference. Same system,” said Mike Manos, AOL’s vice president of technology operations, in a blog post (a generic sketch of such a target-agnostic provisioning layer follows this list).
  • This system was put to the test during this summer’s East Coast earthquake, a 5.8-magnitude shock in Virginia. “The flood of inquiries to AOL’s servers in the aftermath was handled using the system, adding new virtual machines to handle the steep surge in demand,” he wrote.
  • The timing of the announcement is fortuitous, considering AOL is trying to sell itself and touting its datacenter management as one of the assets it brings to any deal. Reuters reports that CEO Tim Armstrong told a meeting of top shareholders that the company could be a good buy for Yahoo! He estimated that the two companies, with a combined audience, elimination of overlaps and AOL’s datacenter prowess, could save $1.5bn.
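
AOL has not published its provisioning code, so the sketch below is only a generic, hypothetical illustration of the idea Manos describes: a single provisioning call path with interchangeable back ends for virtual machines and stand-alone physical boxes. All class and method names are invented.

```python
# Hypothetical sketch of a target-agnostic provisioning layer.
# Not AOL's system; names and steps are invented for illustration.

from abc import ABC, abstractmethod

class ProvisionTarget(ABC):
    """Common interface: the caller never cares what is underneath."""

    @abstractmethod
    def allocate(self, role: str) -> str: ...

    @abstractmethod
    def configure(self, host: str, role: str) -> None: ...

class VirtualMachineTarget(ProvisionTarget):
    def allocate(self, role: str) -> str:
        # A real system would call a hypervisor or cloud API here.
        return f"vm-{role}-01"

    def configure(self, host: str, role: str) -> None:
        print(f"applying '{role}' configuration to {host}")

class PhysicalBoxTarget(ProvisionTarget):
    def allocate(self, role: str) -> str:
        # A real system would pick a machine from a hardware pool.
        return f"rack7-node-{role}"

    def configure(self, host: str, role: str) -> None:
        print(f"applying '{role}' configuration to {host}")

def provision(target: ProvisionTarget, role: str) -> str:
    """Same call path regardless of the back end: allocate, then configure."""
    host = target.allocate(role)
    target.configure(host, role)
    return host

if __name__ == "__main__":
    for backend in (VirtualMachineTarget(), PhysicalBoxTarget()):
        print("ready:", provision(backend, "web-frontend"))
```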

read more


  • Nearly half of data centers are using economizers to provide natural cooling, and they are saving an average of 20 percent off their energy costs, according to a survey by the Green Grid.
  • The survey of 115 people working on data centers of 2,500 square feet or larger found 49 percent using economizers, a type of cooling that takes advantage of favorable weather conditions to cut down on energy use, and another 24 percent considering implementation. Those using economizers also reported an average of seven percent savings on maintenance costs.
  • The survey participants included facilities managers, engineers, IT professionals, executives and project managers.
  • There are two basic kinds of economizers. Air-side systems blow fresh air into the data center and discharge hot air back out, or use air-to-air heat exchangers. Water-side systems remove heat from chilled water loops using a heat exchange with outside air. In the survey, the most popular types of systems were direct outside air (39%) and chilled water with a water-cooled chiller (27%). (A simple control-decision sketch for the air-side approach follows this list.)
  • Those who had considered but not installed economizers cited the difficulty of retrofitting an existing facility as the biggest factor in their decision. Almost three-quarters of economizers are installed during new construction as opposed to retrofitting. Reliability was another big concern for those deciding against economizers.
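
As a toy illustration of how an air-side economizer decision might be automated, the sketch below checks outside conditions against allowable supply-air limits and falls back to mechanical cooling otherwise. The temperature and humidity limits are invented placeholders, not engineering guidance.

```python
# Toy air-side economizer decision. The limits below are illustrative
# placeholders, not engineering guidance.

SUPPLY_AIR_MAX_C = 24.0      # warmest outside air pushed into the cold aisle
SUPPLY_AIR_MIN_C = 10.0      # below this, mix in return air instead
HUMIDITY_MAX_PCT = 80.0      # avoid condensation risk

def cooling_mode(outside_temp_c: float, outside_humidity_pct: float) -> str:
    """Choose between free cooling and the mechanical chillers."""
    if outside_humidity_pct > HUMIDITY_MAX_PCT:
        return "mechanical"                # too humid for outside air
    if outside_temp_c > SUPPLY_AIR_MAX_C:
        return "mechanical"                # outside air too warm
    if outside_temp_c < SUPPLY_AIR_MIN_C:
        return "economizer+mixing"         # free cooling, tempered with return air
    return "economizer"                    # straight free cooling

if __name__ == "__main__":
    for temp, rh in [(5, 40), (18, 55), (30, 70), (20, 90)]:
        print(f"{temp:>3}°C {rh:>3}% RH -> {cooling_mode(temp, rh)}")
```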

read more

