Friday, September 28, 2012

Symantec's Backup Exec 2012 Brings Virtualization to Data Backup Process

Symantec brings easy restoration and virtualization to the crowded backup software market with Backup Exec 2012. The product focuses on offering everything needed for physical and virtual server backups, while incorporating disaster recovery elements that cut data recovery times from hours to minutes across corporate data centers and networks. Integrated physical-to-virtual (P2V) support allows the product to function as a virtual server migration platform as well as a server backup system. A new GUI, along with an end-user-definable dashboard, helps bring an “at a glance” capability to the product’s monitoring and management screens. Physical, virtual and hybrid server support is included, as well as a new process for creating server groups. Other notable capabilities include drag-and-drop task definitions that eliminate the need for manual policy creation and improved restoration wizards that bring simplicity to the disaster recovery process. Priced at $1,662 per license, Backup Exec 2012 now includes bare-metal recovery and P2V capabilities, which were once additional-cost items.

Read More

Thursday, September 27, 2012

Western Digital Releases SAS, SATA Hard Drives

WD launches the RE SAS and SATA line of hard drives, featuring NoTouch technology and up to 4TB of capacity.

As businesses demand ever more storage to hold the volumes of information they generate and receive, Western Digital, a provider of data storage solutions, announced the expansion of its enterprise-class storage offerings with the WD RE SAS and WD RE SATA hard drives in capacities up to 4TB. The 3.5-inch drives are also offered in 250GB, 500GB, 1TB, 2TB and 3TB versions in the SATA line and in 1TB, 2TB and 3TB versions in the SAS line. All drives are backed by a five-year limited warranty.

Both drives offer dual-port, full-duplex connectivity, along with Dual Stage Actuation (DSA) and Rotary Acceleration Feed Forward (RAFF) for improved operation and performance when drives are used in vibration-prone, multi-drive chassis. They also feature the company’s NoTouch ramp-load technology, under which the recording heads never touch the disk media, helping to ensure less wear to heads and media as well as better drive protection in transit. On the green technology side, the drives comply with the Restriction of Hazardous Substances (RoHS) directive through the use of halogen-reduced components.

"Given the insatiable need for storage capacity across all market segments, WD is offering both SAS and SATA interfaces for the WD 4TB RE hard drives to best support both private and public clouds," Doug Pickford, senior director of business marketing for WD's enterprise business unit, said in a prepared statement. "WD continues to pioneer the capacity-optimized 3.5-inch market segment, in particular, and the WD RE SAS and SATA 4TB drives are designed, tested and optimized for enterprise storage and applications, enabling 33 percent greater capacity than previously available drives and up to 2.4PB of raw capacity in a single enterprise rack."

The drives deliver 6 gigabit-per-second (Gb/s) transfer rates, sustained sequential data rates of up to 182MB/s and high random input/output (I/O) rates, and have been field-tested to a 1.2 million-hour mean time between failures (MTBF) rating. The WD RE SAS and SATA drives are compatible with Microsoft’s Windows 7, Vista, XP, 2000 and Windows Server, Linux, and Apple’s Mac OS platforms, according to a company information sheet.

"High density storage, low power consumption, and reliability are crucial for cloud, big data and data center infrastructures and services," Andy Morgan, senior director of storage platforms at storage equipment specialist Xyratex, said in a press release. "Through the early qualification of our partner's enterprise products like WD for compatibility with our own OneStor solutions, we are providing our OEM customers a means to address these requirements and profit from these growing markets."

Read More

Smartphones Increasingly Use DRAM Chips: iSuppli

An IHS iSuppli report finds smartphone manufacturers are turning to DRAM chips as their handsets become more sophisticated.

With smartphones becoming more sophisticated and encompassing ever more functionality, the average amount of dynamic random access memory (DRAM) in these devices is expected to increase by nearly 50 percent this year, according to an IHS iSuppli DRAM Dynamics Market Brief from information and analytics provider IHS. The report drew on the DRAM configurations found in phones dissected through the company’s Teardown Analysis service, which confirmed a trend of increasing DRAM loading in the devices.

In those analyses, all of the handsets dissected had at least 1,024MB, or 1GB, of DRAM, with the exception of the Nokia Lumia 900 and the Apple iPhone 4S, both of which featured 512MB of DRAM. That marks a dramatic change from a year ago, when none of the phones analyzed had more than 800MB of DRAM. Overall, the report projected average DRAM content in smartphones would expand to 666MB in 2012, up from 453MB in 2011 and 202MB in 2010.
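
Those averages are consistent with the headline claim. A quick check, using only the IHS numbers quoted above:

# Year-over-year growth in average smartphone DRAM content (MB), per IHS.
avg_dram_mb = {2010: 202, 2011: 453, 2012: 666}

for year in (2011, 2012):
    prev = avg_dram_mb[year - 1]
    growth = (avg_dram_mb[year] - prev) / prev
    print(year, f"{growth:.0%}")  # 2011: 124%; 2012: 47% -- the "nearly 50 percent"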

However, it was unclear why the Lumia handset had less DRAM than other smartphones, although the report noted it was the only device tested that was running a Microsoft Windows operating system. Samsung’s Note and Galaxy S III devices, the Motorola Droid Razr XT912 and the One X from HTC all contained at least 1GB of DRAM, while the DRAM for Apple’s recently released iPhone 5 has risen to 1,024MB due to new features added to the phone, including a more powerful processor and more memory-intensive applications, the IHS report noted.

“As smartphones become more sophisticated, memory usage in the devices continues to rise, not only to satisfy user wants and needs but also to accommodate demands made by ever-more powerful processors and increasingly refined LCD screens,” Clifford Leimbach, analyst for memory demand forecasting at IHS, said in a press statement. “And as memory has increased in smartphones, the industry has moved from a complex world featuring differing memory densities, to a simpler space where phones look increasingly similar from a memory perspective.”

DRAM density is also climbing and is expected to keep increasing, the report noted, with 4GB chips encompassing roughly a 37 percent share of the DRAM market for smartphones, followed closely by the 8GB chip configuration with 36 percent. Those standings are expected to reverse next year, with 8GB chips rising to command a 46 percent share among smartphones while the share of 4GB chips declines to 28 percent, the report said.

“An even higher-density configuration, the 16GB, is forecast to take the lion’s share in the years ahead, indicating that DRAM density growth will continue uninterrupted for some time to come,” the report projected. “While 16GB DRAM will account for just 2 percent market share in 2012, its portion jumps to 15 percent next year, overtakes 4GB and 8GB by 2013 and 2014, respectively, and then becomes the primary density configuration by 2015, with 56 percent market share.”

Read More

Toshiba Launches Its First Hybrid Drive for Notebooks

The new drive aims to provide the best attributes of solid-state disks (speed and responsiveness) along with the capacity (up to 1TB of storage) and cost-effectiveness of hard disk drives.

A recent trend in notebook and desktop computers has involved the dual-drive PC, in which the portable device uses both a NAND flash solid-state drive and a fast hard drive. The swift SSD handles the BIOS and bootup, and the HDD does most of the remaining work.

Toshiba, the inventor of NAND flash 25 years ago, came out Sept. 25 with an alternative to this: its first hybrid drive for laptops and desktop PCs.

Toshiba's Storage Products Business Unit said it has started customer shipments of its first Hybrid Drive, technically called the MQ01ABDH series. The new drive aims to provide the best attributes of solid-state disks (speed and responsiveness) along with the capacity (up to 1TB of storage) and cost-effectiveness of hard disk drives. HDDs typically cost about one-fourth as much as solid-state drives.

'Self-Learning' Software

The new 2.5-inch, 9.5mm-high SATA Hybrid Drive series features so-called "self-learning" caching algorithms that learn the system user's data access patterns in order to optimize performance, Patty Kim, product marketing manager at Toshiba's Storage Device Division, told eWEEK.  

The caching algorithms also manage which user data is stored to the NAND flash for quick response to near-future access from the host, as well as how the data in the flash is updated, based on intelligent access-pattern learning, Kim said.
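
Toshiba has not published the algorithm itself, but the general idea Kim describes, watching the access pattern and pinning the hottest regions in NAND, can be shown with a toy model. The Python sketch below is a loose illustration under those assumptions, not Toshiba's implementation:

from collections import Counter

class HybridCacheSketch:
    """Toy model of a hybrid drive's "self-learning" cache policy:
    count accesses per block region and keep the hottest regions
    pinned in a small NAND cache."""

    def __init__(self, flash_slots=4):
        self.flash_slots = flash_slots  # NAND cache capacity, in regions
        self.access_counts = Counter()  # observed access pattern
        self.flash = set()              # regions currently held in NAND

    def access(self, region):
        self.access_counts[region] += 1
        # "Learn" by re-ranking: the most frequently hit regions stay in flash.
        self.flash = {r for r, _ in self.access_counts.most_common(self.flash_slots)}
        return "NAND hit" if region in self.flash else "HDD read"

cache = HybridCacheSketch(flash_slots=2)
for region in [1, 1, 2, 1, 3, 1, 2]:  # simulated host access pattern
    print(region, cache.access(region))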

"We think the hybrid drive will overtake the dual drive (for laptops and desktops) because there's no special driver needed, it's OS-agnostic, specific chipsets aren't necessarily needed -- whereas with dual drives you need specific types of chipsets," Kim said.  "It works like a normal drive, but you get the surprise of SSD-like performance."

Since the hybrid drive is one drive instead of two, power consumption is naturally cut way back, so battery life is substantially improved.

One SATA Interface, Less Complexity

"There's also only one SATA interface, so you reduce a lot of the complexities and get better performance," Kim said.

Toshiba's hybrid series comes in 1TB and 750GB capacities. Kim said the MQ01ABDH 100 and MQ01ABDH 075 models are suited for ultrathin and standard-size notebooks, gaming PCs, all-in-one and slimline desktops, and other digital computing applications.

The new hybrids will be featured in various notebook PCs, with the first systems shipping in time for the holiday season, Kim said.

Read More

Wednesday, September 26, 2012

Hitachi Data Systems Launches New Midrange Storage Package

HDS's HUS VM encompasses external virtualized capacity of up to 64PB and consolidates block, file and object data natively onto a single platform for secure data services.

Hitachi Data Systems on Sept. 25 launched a new unified network storage software package for small and midrange businesses that aims to provide enterprise-quality virtualization for all data types.

The Hitachi Unified Storage VM storage software package -- which not only works with all HDS hardware but also with storage arrays from other manufacturers -- encompasses external virtualized capacity of up to 64PB and consolidates block, file and object data natively onto a single platform for secure data services.

Bringing Virtual Storage into a 'Net New Product'

"What we're doing is bringing the system software from our virtual storage platform into a net new product targeted at small and medium-size enterprises," Mike Nalls, a senior product marketing manager at HDS, told eWEEK. "The needs of a small-to-medium size business are similar to a large global multinational, only their capacity requirements may be more modest."

A longtime frustration for storage administrators at SMBs has been a lack of scalable, agile storage systems without capital costs that go through the roof. IT managers with multiple duties have always struggled with the same demands as large enterprises, only with even tighter budgets, Nalls said.

Smaller enterprises need storage systems that consolidate and simplify data management yet still deliver good performance, reliability and improved serviceability. Until now, Nalls said, small and medium enterprises have been forced to accept tradeoffs, such as sacrificing reliability and more effective virtualization of their data to stay within limited budgets.

HDS is offering volume capacity with this new system. The HUS VM can hold up to 1,152 drives in one frame; as a comparison, IBM's XIV system holds 180 drives, Nalls said.

Key features of the HUS VM include the following, according to HDS:

  • Consolidates block, file and object data natively onto a single platform for fast delivery of secure data services;
  • Unified platform allows users in different departments to share resources for various applications to lower capital investment and lower operation cost;
  • Hitachi Command Suite unified management reduces the time required to administer storage for multiple applications;
  • Supports solid state drive options now and will extend the recently announced HDS flash strategy with integrated flash devices based on the new Hitachi flash memory controller unveiled in August;
  • Enables up to 40 percent lower environmental cost, due to power efficiency and space-saving design.

Built from the Ground Up for Midrange Market

"One of the key advantages of HUS VM -- beyond being unified -- is its virtualization capabilities," said 451 Research Senior Storage Analyst Henry Baltazar. "HDS built this product from the ground up to address specific customer requirements around virtualization and scale, consolidating the storage infrastructure under a single management framework at a lower price point.

"With this move, HDS is enabling users to start reducing their licensing fees on older storage systems. Therefore, as an aggressive move against NetApp and EMC; this new product makes perfect sense for HDS."

HDS also is extending its 100 percent data availability guarantee to the new system, something that no other competitor has yet offered to midrange-size businesses, Nalls said. The new HUS VM system is available now, and HDS has published a data sheet with more information, including pricing.

Read More

Symantec Unifies Physical, Virtual and Hybrid Backups With Backup Exec 2012

The latest iteration of Backup Exec brings advanced backup capabilities to multiple storage technologies. It also improves the data backup and restoration process for IT administrators using virtual, physical, and hybrid storage technologies.

Symantec’s Backup Exec 2012 has evolved from its once humble roots into a complete backup and recovery environment that works across platforms, storage technologies and even the cloud.

Although the core Backup Exec application has been around for decades, the latest iteration feels anything but old. Symantec has given new life to the product with a major redesign that combines ease of use with the sophistication to handle multiple, concurrent backup events from a variety of devices and storage technologies.

I last reviewed Backup Exec several years ago and was surprised at how much the product has changed. The latest version, Backup Exec 2012, has a starting retail price of $1,662 and features a completely redesigned GUI, along with a plethora of new and enhanced features.

I installed Backup Exec 2012 into a multi-server Windows environment that consisted of two Windows Server 2008 R2 systems, as well as three virtualized Windows 2008 R2 servers, which were configured under both Microsoft Hyper-V and VMware ESXi environments.

Installation was uneventful, which is a good thing, especially considering the complexity of Backup Exec and what it does behind the scenes. Perhaps the biggest enhancement offered by Backup Exec 2012 is its focus on Disaster Recovery (DR). Symantec claims BE2012 now offers a complete recovery environment for physical and virtual servers.

I found that Symantec was dead-on with its claims and that BE2012 was indeed able to quickly recover complete servers, both physical and virtual, in a matter of minutes. I tested that capability by using the product's administration console to schedule complete backups of all of the servers in my test environment. I had that schedule execute overnight and was presented with a comprehensive backup report the next day. Backups were speedy, although I had only a very basic configuration set up and about 30GB of files in my test environment.

I simulated a disaster by removing the hard drive in a Windows 2008R2 Server, which contained the VHDs (Virtual Hard Disks) for my Hyper-V based Windows Servers. I also replaced the server running VMware ESXi with a similar system to simulate a “bare iron” type of restore.

I was able to restore my servers without a hitch, just by following the instructions provided by Backup Exec’s DR module. I only had to do a minimal amount of pre-configuration work on the newly introduced hardware, and Backup Exec 2012 pretty much handled the rest.

One of the most powerful features offered is BE2012’s ability to convert a backed-up server into a virtual machine. Here, I was able to select one of my backed-up servers and then quickly transform it into a virtual server.

Basically, BE2012 reads in the backup and converts it to a mountable VHD while automatically abstracting the hardware layer, allowing a virtual server to be brought up in an almost ready-to-use state.

That feature alone may be worth the price of entry, simply because many IT managers are looking for a way to migrate live physical servers into virtualized environments. I was also impressed with the product’s ability to perform P2V chores concurrent with a backup.

Here, during the backup process, the software creates both a traditional backup file and a VHD file concurrently. That can be a real time saver when recovering from a disaster simply because you can bring up the failed server as a virtual machine, while repairing the physical system.
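
Symantec has not documented the internals, but the single-pass, dual-output idea is a classic fan-out pattern. Here is a minimal Python sketch of it, with invented function names, in which one read pass over the source feeds both a backup writer and a VHD writer:

import queue
import threading

def backup_concurrently(blocks, write_backup, write_vhd):
    # One reader, two writer threads: each block read from the source is
    # queued to both sinks, so a single pass yields both artifacts.
    q_backup, q_vhd = queue.Queue(), queue.Queue()

    def drain(q, sink):
        while (block := q.get()) is not None:  # None is the end-of-data sentinel
            sink(block)

    writers = [threading.Thread(target=drain, args=(q_backup, write_backup)),
               threading.Thread(target=drain, args=(q_vhd, write_vhd))]
    for w in writers:
        w.start()
    for block in blocks:  # the single read pass over the source volume
        q_backup.put(block)
        q_vhd.put(block)
    for q in (q_backup, q_vhd):
        q.put(None)
    for w in writers:
        w.join()

backup_concurrently([b"block-1", b"block-2"],
                    lambda b: print("backup file <-", b),
                    lambda b: print("VHD file    <-", b))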

In many cases, that can turn a recovery process that normally takes hours into something that can be accomplished in a few minutes. Although these actions may sound complex, BE2012 offers several restoration wizards that bring point and click simplicity to the recovery process.

Another major enhancement is the elimination of backup policies. At first blush, that may sound like a bad thing; however, Symantec has replaced the antiquated policy-based backup model with something the company calls “Backup Stages.” Here, instead of building complex, text-based backup execution policies, I was able to use a GUI-based designer to set up backup stages, clicking on backup-related tasks to create an execution map. That proves to be a much easier way to build a backup (or restore) job than dealing with the manual process of policy creation.

Other notable changes include the concept of creating logical server groups. With logical groups, physical servers can be placed into multiple groups, allowing the creation of backup stages that execute only on those groups. That in turn creates the opportunity to define multiple backup events and have them executed on specific server groupings. Changing how servers are backed up becomes a simple matter of dragging the physical server icon into a different logical group.
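
Neither stages nor logical groups require anything exotic under the hood. Modeled as plain data in Python (all field names below are illustrative assumptions, not Backup Exec's actual schema), the ideas in the last two paragraphs look roughly like this:

# Logical groups map group names to member servers.
groups = {
    "web-tier": ["web01", "web02"],
    "db-tier": ["db01"],
}

# Each "stage" is one step in the execution map built in the GUI designer,
# and it targets a group rather than an individual server.
stages = [
    {"group": "web-tier", "task": "full-backup", "schedule": "Sun 02:00"},
    {"group": "web-tier", "task": "p2v-convert", "schedule": "after-backup"},
    {"group": "db-tier", "task": "full-backup", "schedule": "daily 01:00"},
]

# The drag-and-drop reassignment described above is just moving a server
# between groups; every stage targeting the new group now covers it.
groups["web-tier"].remove("web02")
groups["db-tier"].append("web02")

for stage in stages:
    print(stage["task"], "runs against", groups[stage["group"]])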

While Backup Exec 2012 isn’t the only enterprise backup product on the market, it certainly offers a new concept in backup execution, combining physical and virtual disaster recovery with ease of use and reliability. Other products, such as Acronis Backup and Recovery, CA ARCserve and EMC NetWorker, normally focus on a single backup scenario (such as imaging or cloud backups) and require the purchase of additional modules to incorporate robust support for cross-platform environments.

Read More

Tuesday, September 25, 2012

Remote Network Access: 10 Signs It's Time to Deploy Updated Control Software

IT departments have been pushed to accomplish more with fewer resources for years. While enterprises deploy various solutions to improve efficiencies in numerous parts of the data center, they often overlook an important one that can bring significant benefits: remote network access. IT staffs don't get to choose when something goes wrong, and remote network access is a potent tool enabling them to fix trouble quickly. New-generation remote access and control software packages help achieve and maintain regulatory compliance, ratchet up security and enable support personnel to help employees and customers around the world. Netop, a longtime provider of remote IT system access software for enterprises that is used by 52 percent of Fortune 100 companies, has more than 30 years of experience in this sector and offers some background in this slideshow. Here are 10 signs that an enterprise might need to update remote access and control software.
Read More

Public Cloud Services Market Growing Faster Than Expected: Gartner

IT researcher Gartner reported Sept. 19 that the global appetite for public cloud services is ramping up faster than it and other analysts expected.

The subscription Web services market is now forecast to grow 19.6 percent in 2012 to total $109 billion worldwide, Gartner said in a report. Business process services (also known as business process as a service, or BPaaS) represent the largest segment, accounting for about 77 percent of the total market.

Past studies showed that cloud services were projected to grow 12 percent to 15 percent in 2012, with expected increases in 2013 through 2015.

As has been reported here in eWEEK previously, infrastructure as a service (IaaS) is the fastest-growing segment of the public cloud services market; that sector is expected to grow a whopping 45.4 percent in 2012.

"The cloud services market is clearly a high-growth sector within the overall IT marketplace," said Ed Anderson, research director at Gartner. "The key to taking advantage of this growth will be understanding the nuances of the opportunity within service segments and geographic regions, and then prioritizing investments in line with the opportunities."

BPaaS is the largest segment primarily because of the inclusion of cloud advertising as a subsegment, Gartner said. BPaaS is forecast to grow to $84.2 billion in 2012, up from $72 billion in 2011. In 2011, cloud advertising represented about 47 percent of the total public cloud services market, making it the biggest identifiable subsegment in the forecast, Gartner said.

Through 2016, cloud advertising will continue to account for about 47 percent of total public cloud services spending, Gartner said.

Software as a service (SaaS) is the next-largest segment and is forecast to grow to $14.4 billion in 2012, while IaaS is forecast to grow from $4.3 billion in 2011 to $6.2 billion in 2012. In 2010, the IaaS market was less than one-third the size of the SaaS market. By 2016, the IaaS market will grow to almost equal the size of the SaaS market.

Read More

Toshiba Offers 500GB Canvio Slim Portable Hard Drive

The Canvio Slim comes bundled with NTI Backup Now EZ software for data backup, as well as USB 3.0 technology for faster transfer speeds.

Electronics manufacturer Toshiba released what it claims is the world’s thinnest portable hard drive, the Canvio Slim, which measures just 9mm thick, 107mm long and 75mm wide, and boasts a storage capacity of 500GB.

The drive, which will be available at select retailers and on ToshibaDirect.com in October 2012, carries a suggested retail price of $114.99.

The device features a brushed-aluminum design and a built-in USB 3.0 interface to help improve file transfer performance for large media files and minimize the wait time for backup. The Canvio is also backward-compatible with USB 2.0 devices, the company noted. In addition to its slim form factor, the drive comes bundled with NTI Backup Now EZ software, which provides a system scan to recommend the best type of coverage; however, the preloaded software is compatible with Windows OS only.

Users can choose between backing up files to the cloud (a free 30-day trial of cloud backup is included), backing up files and folders to the Canvio Slim portable hard drive, backing up anything saved on the computer, or all three. Rounding out the package is an NTFS driver for Mac computers, which allows users to store and access files from a PC or Mac without reformatting. The drive, compatible with a variety of operating systems, is backed by a three-year limited warranty.

“As consumer electronics continue to get thinner, lighter and more portable, we recognized a huge demand to create a storage device that is in line with those trends,” Maciek Brzeski, vice president of product marketing and development of branded storage products for Toshiba America Information Systems’ digital products division, said in a press statement. “With the Canvio Slim, consumers can now easily stash their storage device right along with their Ultrabook, knowing that their data is always safe, even when they’re on the go.”

The Canvio Slim is the latest in a recent spate of hard-drive products released by Toshiba. Last month, the company expanded its enterprise solid-state-drive lineup with three models aimed at high-performance use cases. Its PX-Series is designed for servers that take the most pounding each day: boot, read-intensive, entry-level servers; entry-to-midrange application servers; and high-performance enterprise application servers.

Toshiba, which in 2012 is celebrating its 25th anniversary as the inventor of NAND flash memory, made an impressive comeback in the first quarter of 2012 in the NAND flash storage market, after two major revenue declines in 2011 and the devastating earthquake and tsunami in Japan. Defying an industry-wide contraction in revenue, Toshiba surged to double-digit growth, posting NAND sales revenue of $1.71 billion in the first quarter, up 19 percent from $1.43 billion in the fourth quarter of 2011, according to a report from IT research firm IHS.

Read More

Monday, September 24, 2012

Netgear Centria Router Offers Automated Backup

Networking solutions specialist Netgear announced the release of its Centria all-in-one automatic backup, media server and high-speed WiFi router. Two versions, the WNDR4700 and the WNDR4720, which sports an internal 2-terabyte hard drive, are available.

The appliances are sold worldwide through retail stores and online for $229.99 (WNDR4700) and $349.99 (WNDR4720). Both models have two SuperSpeed USB 3.0 ports for adding even more storage, a company data sheet noted.

Centria uses ReadyShare Vault for Windows PC backup and supports Apple Time Machine for Mac, delivering wireless backup to the appliance's integrated hard drive or to a USB-connected external hard drive. An SD card reader allows single-click backup of media to the internal hard drive. The router component of Centria uses simultaneous dual-band (2.4GHz and 5GHz) technology with extended WiFi range. The 900Mbps router is designed to handle high-definition streaming of video and gaming content, the company noted. On the security side, Centria offers double firewall protection and denial-of-service (DoS) attack prevention.

The router also features a dashboard to monitor, control and repair the home network, and comes packaged with Netgear's Genie platform, which includes the MyMedia app that lets smartphones (including Google Android and Apple iOS devices) find and play media remotely on any Digital Living Network Alliance (DLNA) device. The Centria router is also compatible with AirPrint, which lets users print on any USB printer from an iPad or iPhone. Rounding out the package are live parental controls, guest-network access capability, a broadband usage meter and one-click fixes for common network issues.

"With increasing reliance on their computers to store important data and irreplaceable family photos and videos, it is imperative for consumers to back up their PCs and Macs so that if their computer crashes or is lost, then their important files will still be safe and accessible on Centria," Sandeep Harpalani, senior product line manager for wireless networking for Netgear, said in a prepared statement. "After the fast initial setup, Centria will automatically back up PCs and Macs every time they connect to the home network. This makes data backup a simple set-and-forget operation and simplifies digital life for consumers."

The company also announced a family of three NeoTV streaming players, among the first to support the HTML 5 standard. All three players--the NeoTV (NTV300), NeoTV PRO (NTV300S) and NeoTV MAX (NTV300SL)--offer Internet connectivity through built-in WiFi or a wired Ethernet connection, and each player includes a remote control with one-touch Quick Start buttons. The base NTV300 model is priced at $49.99, while the NTV300S Pro player is priced at $59.99, and the MAX (NTV300SL) is priced at $69.99.

Nathan Eddy is Associate Editor, Midmarket, at eWEEK.com. Before joining eWEEK.com, Nate was a writer with ChannelWeb and he served as an editor at FierceMarkets. He is a graduate of the Medill School of Journalism at Northwestern University.

Read More

HP Launches New Hybrid Tablet, Laptops, Desktops Ahead of Windows 8

By Chris Preimesberger | Posted 2012-09-21

Hewlett-Packard plans to take full advantage of the Oct. 26 release of Microsoft's Windows 8. At media events around the country, the company has been showing off a number of new Windows 8-loaded PCs, including a new EliteTab tablet (which will be revealed next month) and a convertible laptop/tablet with a removable screen. The convertible Envy x2, which will be out in time for the holidays, looks like a small laptop that weighs in at about 3 pounds. But that's where the similarity ends. When you click a small magnetic latch just above the keyboard, it releases an 11.6-inch screen that turns the unit into an independent Windows 8 tablet. If you use the x2 as a laptop, you have a choice of using a traditional trackpad or a touch screen. In tablet or laptop mode, it can be used to watch video, run Windows tablet apps or regular Windows desktop software, such as Microsoft Office and PowerPoint. If HP wants to differentiate itself in the device world against all the usual-suspect incumbents, this is the device with the potential to do it. This slide show features highlights from HP's fall showcase, with more announcements due next month.  

Chris Preimesberger was named Editor-in-Chief of Features & Analysis at eWEEK in November 2011. Previously he served eWEEK as Senior Writer, covering a range of IT sectors that include data center systems, cloud computing, storage, virtualization, green IT, e-discovery and IT governance. His blog, Storage Station, is considered a go-to information source. Chris won a national Folio Award for magazine writing in November 2011 for a cover story on Salesforce.com and CEO-founder Marc Benioff, and he has served as a judge for the SIIA Codie Awards since 2005. In previous IT journalism, Chris was a founding editor of both IT Manager's Journal and DevX.com and was managing editor of Software Development magazine. His diverse resume also includes: sportswriter for the Los Angeles Daily News, covering NCAA and NBA basketball, television critic for the Palo Alto Times Tribune, and Sports Information Director at Stanford University. He has served as a correspondent for The Associated Press, covering Stanford and NCAA tournament basketball, since 1983. He has covered a number of major events, including the 1984 Democratic National Convention, a Presidential press conference at the White House in 1993, the Emmy Awards (three times), two Rose Bowls, the Fiesta Bowl, several NCAA men's and women's basketball tournaments, a Formula One Grand Prix auto race, a heavyweight boxing championship bout (Ali vs. Spinks, 1978), and the 1985 Super Bowl. A 1975 graduate of Pepperdine University in Malibu, Calif., Chris has won more than a dozen regional and national awards for his work. He and his wife, Rebecca, have four children and reside in Redwood City, Calif. Follow on Twitter: editingwhiz
Read More

IBM Smarter Data Center Opens in Canada

As part of its continued expansion into Canada, IBM, along with the governments of Canada and Ontario and the City of Barrie, announced the opening of a new smart data center, the IBM Canada Leadership Data Centre.

The Barrie, Ontario-based center is one of Canada's most advanced computing facilities and it will focus on advancements in energy-efficient data center management, business continuity, resiliency, security and disaster recovery services to help organizations efficiently manage growth while reducing costs and securely mitigating risk, IBM said.

The data center represents a $90 million investment from IBM and will establish 20 skilled jobs in Barrie, part of the $175 million investment and 145 jobs created through the April 2012 launch of the IBM Canada Research and Development Centre network.

In time, the new data center is expected to provide key infrastructure and personnel to help underpin ongoing research and development initiatives tied to this network, IBM said. The IBM funding is supported through the Government of Ontario's previous $15 million investment toward these initiatives.

"Canadian organizations are seeking more strategic ways to increase operational efficiencies and position themselves for sustainable growth," said John Lutz, president of IBM Canada, in a statement. "We continue to invest in smarter infrastructure within Canada because businesses can't afford downtime with today's economic pressures. This new facility provides a flexible foundation ingrained in best practices so we can deliver essential services to help organizations and partners better manage data, reduce operating costs, improve productivity and gain competitive advantage."

"The progress of this unprecedented research partnership is wonderful to see," said Brad Duguid, Minister of Economic Development and Innovation for the Government of Ontario, in a statement. "This project will help to improve how we deal with challenges in health care, infrastructure and cities, energy and water conservation while making use of Ontario's greatest resource-our people."

Fifty percent of Canadian organizations recently surveyed reported that providing sufficient data center space and ensuring the availability required to meet customer service demands are among their top challenges, according to IBM. This is particularly important for high-availability industries like financial services, government and retail. Yet organizations typically spend about 70 percent of their IT budgets simply maintaining existing environments. In parallel, IDC estimates the amount of information managed by enterprises will grow 50 times over the next decade, with the number of associated servers installed by organizations increasing by 49 percent in the next two years.
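
IDC's 50-times-in-a-decade projection translates to a steep compound annual growth rate; the one-liner below simply does that arithmetic:

# CAGR implied by 50x growth over 10 years.
growth_factor, years = 50, 10
cagr = growth_factor ** (1 / years) - 1
print(f"{cagr:.1%} per year")  # -> about 47.9% per year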

In an IBM survey of 300 IT leaders, the 21 percent of organizations that ran efficient data centers were able to spend 50 percent more on new projects and innovation to make their organizations more successful, IBM said.

IBM's new, modular data center will provide synchronous replication of data with another center that is within 100 kilometers. This means organizations running mission-critical applications can locate their primary IT operations in one center and establish a data recovery center far enough away to reduce the risk of a geographic disaster impacting both sites, but close enough to ensure operational or customer data is always available.
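
The roughly 100-kilometer limit is driven by physics: synchronous replication acknowledges each write only after the second site has received it, so every kilometer adds latency to every write. A rough calculation, assuming light travels through fiber at about 200,000 km/s (roughly two-thirds the speed of light in a vacuum):

# Propagation delay added to each synchronously replicated write.
FIBER_KM_PER_MS = 200  # ~200,000 km/s expressed per millisecond
distance_km = 100

rtt_ms = 2 * distance_km / FIBER_KM_PER_MS  # out and back before the ACK
print(f"~{rtt_ms:.0f} ms added per write")  # -> ~1 ms, before equipment delays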

The new facility will provide 25,000 square feet of initial capacity with the ability to grow to 100,000 square feet. IBM has designed and delivered more than 1,000 similar modular data centers for customers worldwide. Modular data center design uses small increments of standardized components to match business requirements with IT requirements and only add data center capacity when needed. Modular centers can be expanded in half the time of a traditional data center to easily accommodate growing demand. They help clients save up to 30 percent per year in energy costs compared with traditional centers, IBM said.

This newest addition to IBM's existing network of 17 data centers within Canada bolsters the company's $75 million investment in Markham and Montreal-based data centers during 2011.

Darryl K. Taft covers the development tools and developer-related issues beat from his office in Baltimore. He has more than 10 years of experience in the business and is always looking for the next scoop. Taft is a member of the Association for Computing Machinery (ACM) and was named 'one of the most active middleware reporters in the world' by The Middleware Co. He also has his own card in the 'Who's Who in Enterprise Java' deck.

Read More

Tuesday, September 18, 2012

1010Data Launches Cloud-Based Community for Big Data Management

1010data may not be a household name in the IT world, but that's not stopping it from starting up a new industry community in which enterprises can analyze, share and monetize big data.

The cloud platform provider on Sept. 18 announced the formation of the 1010data Analytical Dataspace, which it claims to be the first cloud-based community in which organizations can share, analyze and thus monetize big data workloads.

The Analytical Dataspace is a forum where organizations, regardless of size or vertical market, can analyze large datasets in near-real time, combine their own data with other organizations' data to gain new analytical insights and start new revenue streams from selling their data.

1010data, a New York City-based company, is in this because it has developed a hosted, on-demand big data analytics platform that it claims can process big data and deliver results quickly and inexpensively.

The Analytical Dataspace comprises three main offerings, according to CEO Sandy Steier:

  • Analytics Platform as a Service: Organizations can utilize 1010data's cloud-based analytics platform to run sophisticated queries on massive datasets. They can also use the platform to build industry-specific applications and can even use it as their enterprise data warehouse for operational reporting.
  • Data Mashups: The platform enables customers to use data from other organizations and combine it with their own to glean analytical insights previously unattainable. For example, a retailer can combine its POS data with other licensed datasets distributed on 1010data, such as weather, real estate or econometric datasets, and build models to respond to changes in demand driven by weather, economic conditions or shifts in buyer behavior from brick and mortar to online shopping (a toy example of such a join appears after this list).
  • Data Monetization: Because the Analytical Dataspace is cloud-based, organizations can quickly and easily sell their data (or analysis) to interested parties. This helps organizations create additional revenue opportunities from the data they generate.
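
To make the mashup idea concrete, here is a hypothetical toy join of POS and weather data using Python and pandas. All column names and values are invented for illustration and imply nothing about 1010data's actual platform:

import pandas as pd

pos = pd.DataFrame({"date": ["2012-09-01", "2012-09-02"],
                    "region": ["NE", "NE"],
                    "units_sold": [120, 85]})
weather = pd.DataFrame({"date": ["2012-09-01", "2012-09-02"],
                        "region": ["NE", "NE"],
                        "high_temp_f": [88, 61]})

# The "mashup": join the retailer's own data with the licensed dataset.
mashup = pos.merge(weather, on=["date", "region"])
print(mashup)

# From here one could model demand against weather, for example:
print(mashup["units_sold"].corr(mashup["high_temp_f"]))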

"The Analytical Dataspace gives enterprises a single destination for their big data analytics needs, allowing them to easily capitalize on the massive data they or others are creating," Steier said.

"We believe that by creating a community for data providers and users, analysts and business decision makers, they can have the world's data at their fingertips and leverage it to identify business opportunities that have never been possible before."

Read More

Mobile Computing, Virtualization Increasing Data Center Complexity: Symantec

Emerging technologies such as cloud computing, virtualization and mobile devices are rapidly increasing the complexity of managing data centers, according to the results of security specialist Symantec’s 2012 State of the Data Center Survey, which found 79 percent of respondents are dealing with this issue.

The results suggest organizations should consider taking steps to manage organizational resources in order to rein in operational costs and control information growth. Forty-four percent of respondents said mobile technology is the leading driver of data center complexity.

Sixty-five percent said the number of business-critical applications is increasing or increasing greatly, while server virtualization (43 percent) and public cloud (41 percent) were cited alongside mobile computing as major complicating factors. Nearly half of the organizations cited rising costs as an effect of complexity, along with security breaches (35 percent), downtime (35 percent), reduced agility (39 percent) and longer lead times for storage migration (39 percent).

“As today’s businesses generate more information and introduce new technologies into the data center, these changes can either act as a sail to catch the wind and accelerate growth, or an anchor holding organizations back,” Brian Dye, vice president of Symantec’s information intelligence group, said in a prepared statement. “The difference is up to organizations, which can meet the challenges head on by implementing controls such as standardization or establishing an information governance strategy to keep information from becoming a liability.”

The results, based on responses from 2,453 IT professionals at organizations in 34 countries, also suggest organizations are keen to implement measures to reduce complexity, including training, standardization, centralization, virtualization and increased budgets; 63 percent said increasing their budget would be somewhat or extremely important to dealing with data center complexity.

The overriding initiative, however, seems to concern information governance strategies that help organizations proactively classify, retain and discover information in order to reduce information risk and the cost of managing information.

The survey found 90 percent of respondents are considering information governance policies or are currently implementing them. Security was the main driver of information governance, cited by 75 percent of survey respondents, followed by the availability of new technologies that make information governance easier (69 percent), increased data center complexity (65 percent) and data growth (65 percent).

Improving security was also the top goal of organizations implementing information governance programs, cited by three-quarters of survey respondents, followed by ease of finding the right information in a timely manner (70 percent), reduced costs of information management (69 percent) and storage (68 percent), and reduced legal and compliance risks. Fifty-nine percent of organizations surveyed said moving to the cloud was one of the goals of implementing information governance.

Read More

Monday, September 17, 2012

How Facebook Is Handling All That Really Big Data


MENLO PARK, Calif. -- Facebook is much like the Starship Enterprise in that it likes to go to places no company has gone before.

This is probably because not too many IT companies, especially young ones, have had to serve upwards of 950 million registered users -- including a high percentage on a real-time basis -- daily. Not many have to sell advertising to about 1 million customers or have dozens of new products in the works, all at the same exact time.

Facebook, which has a clear do-it-yourself IT approach, also designs its own servers and networking. It designs and builds its own data centers. Its staff writes most of its own applications and creates virtually all of its own middleware. Everything about its operational IT unites it in one extremely large system that is used by internal and external folks alike.

For example, Facebook's human resources group, the accounting office, Mark Zuckerberg on email and even you at your laptop checking your status are all using exactly the same gigantic, amorphous data center system that circles the globe in its power and scope.

Everything Facebook Does Involves Big Data

"So just about everything we do turns out to be a Big Data problem," said Jay Parikh, Vice President of Infrastructure Engineering at Facebook, who spoke recently to a small group of journalists at the company headquarters. "This affects every layer of our staff. We've talked with some of you about the servers, storage, networking and the data center, as well as all the software, the operations, the visibility, the tools -- it all comes together in this one application that we have to provide to all our users."

Big data simply is about having insight and using it to make an impact on your business, Parikh said.

"It's really very simplistic. If you aren't taking advantage of the data you are collecting and being kept in your business, then you just have a pile of a lot of data," Parikh said. "We are getting more and more interested in doing things with the data we are collecting."

Facebook doesn't always know what it wants to do with the user lists, Web statistics, geographic information, photos, stories, messages, Web links, videos and everything else that the company collects, Parikh said. "But we want to collect everything, we want to instrument everything: cameras, when that door opens and closes, the temperature in this room, who walks in and out the lobby.

"We want to know who visits the site, what activities they do, where they do on the site. So everything is interesting to us," he said.


Read More

Cloud Storage Specialist Box Launches Accelerator Network

With market demand growing for cloud-based storage services, Web-based data storage specialist Box on Sept. 17 announced the launch of Accelerator, a global data transfer network designed to give customers up to a 10 times boost in upload speeds.

The platform is a combination of new infrastructure in nine locations throughout the world and network intelligence software. The company said it has Accelerator locations in the Northwest, Midwest and on the East Coast in the United States, as well as at the new Box office in London, among other places.

Box is one of several options organizations have for uploading and storing data in the cloud, alongside Google Drive, Dropbox and SkyDrive. While security is a rising concern for many, upload speeds are a major competitive factor, which led Box to hire information and analytics specialist Neustar to conduct an analysis of the upload performance for multiple cloud storage providers for a period of three days earlier this month.

The analysis covered the performance of uploading a 25MB file to four cloud storage services: Box, Dropbox, Google Drive and SkyDrive. Box had the lowest average upload time of 15.7 seconds, which was approximately 2.7 times faster than the closest competitor, Google Drive.

“With Box Accelerator, we’re able to give our customers the fastest way to get their content to the cloud. What can 10X faster uploads mean to a business? Look at it this way: instead of waiting one hour for a several-hundred MB file to post, 10X faster means you get it done in under 6 minutes,” Grant Shirk, Box’s senior enterprise product marketing manager, wrote in a company blog post. “Our mission at Box is to fundamentally change the way people work; we’re constantly seeking new ways to improve the efficiency of everyday tasks. A big part of this is making sure every experience is clean, simple and fast for our users around the world.”
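
The quoted figures are easy to translate into throughput and time savings. The arithmetic below uses only the numbers above (a 25MB test file, a 15.7-second Box average, and the 2.7x and 10x claims):

# Throughput implied by the Neustar test, plus the blog post's 10x example.
file_mb, box_seconds = 25, 15.7
print(f"Box: {file_mb / box_seconds:.2f} MB/s")           # -> ~1.59 MB/s
print(f"Nearest competitor: ~{box_seconds * 2.7:.0f} s")  # -> ~42 s for the same file
print(f"10x faster: a 60-minute upload takes {60 / 10:.0f} minutes")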

In July, the company announced that it had landed a huge new $125 million investment, which includes $100 million from global growth investor General Atlantic. The investments will fund continued support of Box's growing enterprise customer base, global expansion, and product research and development. Box now counts more than 7 million individual and 120,000 business customers, with about 250,000 new users joining each month.

In an effort to further reach out to enterprises, Box unveiled a slew of content management tools in May, as well as an enterprise licensing agreement for new and existing customers deploying Box to their organization. In response to the growing bring-your-own-device (BYOD) trend, Box also released mobile device security settings for Google Android devices, which give IT admins the ability to apply passcode locks and enhanced permissions for offline file access for a variety of Box applications.

Read More

Friday, September 14, 2012

Acronis Acquires File-Sharing Company GroupLogic

Acronis, which sells disaster recovery and data protection technology for physical, virtual and cloud environments, has acquired GroupLogic, whose software ensures secure enterprise file access, sharing and syncing, in what an analyst says is a consolidation move already affecting some companies in both areas.

The synergy between the two companies is that Acronis offers disaster recovery and data protection, while GroupLogic specializes in file sharing in expanding mobile environments and on collaboration platforms where end users are widely distributed but create and share content over the network.

The staffs of both companies will merge, with Acronis President and CEO Alex Pinchev serving as CEO of the combined companies and GroupLogic CEO Chris Broderick becoming senior vice president of mobility solutions for Acronis. Terms of the deal between the two privately held companies were not disclosed.

GroupLogic, the smaller of the two firms, benefits from the acquisition by gaining access to Acronis’s 175,000 global customers and 25,000 channel partners, said Dave Simpson, senior storage analyst with 451 Research, who added that GroupLogic’s customer base is primarily in the United States.

Conversely, Acronis gains from the acquisition the ability to offer the GroupLogic file sharing and sync solution that many of its competitors in the backup and recovery space don’t have, Simpson said.

“That is a market that it has been very hard for everybody, including Acronis, to differentiate in,” he said. “This gives them something that most of their traditional competitors do not have.”

Another differentiator for Acronis is GroupLogic’s support for the integration of Apple devices into enterprise server environments.

But Acronis’ advantage could be short-lived, Simpson said, if its competitors pursue the same strategy.

Acronis faces some major competition in the backup and recovery space, including from Symantec, CA Technologies and CommVault, Simpson said. There is also a subset of vendors that do backup and recovery of virtual machine data only, including Veeam and, most recently, Dell. Dell is about to close its $2.4 billion acquisition of Quest Software, following its February acquisition of AppAssure.

On the file sharing and sync part of the business, the most well known provider is Dropbox, but other players include Egnyte, FileTrek and ownCloud, the latter an open-source software solution for file sharing primarily within a company.

“This is the beginning of what I think will be a lot of consolidation in the file sync and sharing space,” said Simpson.

As in many technology areas, when a hot new trend emerges (in this case, use of the cloud for collaboration and file sharing), a number of new players appear. However, over time a few become dominant and begin swallowing up their smaller competitors.

Read More

Tuesday, September 11, 2012

Global Storage Software Market Goes Flat in Q2 2012

Curiously, the data storage hardware market continues its slow but steady income improvement each quarter, but the corresponding software sector seems to have leveled off, according to a major industry researcher.

Figures released Sept. 11 from IDC's Worldwide Storage Software QView show that the worldwide storage software market closed the second quarter of 2012 higher than a year ago, but for all intents and purposes it was flat.

Global revenue during the second calendar quarter increased by a mere 0.9 percent year-over-year to $3.36 billion. This was the second consecutive quarter of reduced year-over-year growth for the market and a performance level lower than any time since the fourth quarter of 2009, IDC said.

The calendar years 2008 and 2009 were central in the largest global recession in more than 70 years, and the data storage software business wasn't immune to the downturn.

EMC, IBM and Symantec remained the top three storage software suppliers with 26.4 percent, 14.7 percent, and 14.6 percent of the market, respectively. Demonstrating the largest year-over-year growth during the quarter were CommVault, with a 21.5 percent increase, and EMC, with a 7.4 percent gain.

Last April, CommVault announced a new partnership with Microsoft to deliver its Simpana Data Management in the Windows Azure cloud, a factor that certainly helped spur the company's growth in the quarter.

"The second-quarter storage software results were mixed when viewed by supplier or functional market," said Eric Sheppard, research director in storage software at IDC. "Indeed, five of the top eight suppliers experienced revenue growth, but this was offset by declines within a few of the market's larger suppliers.

"Most functional markets showed increased investments, but the increases were far smaller than the market had been experiencing over the past two years. These generally lower results can be partially attributed to suppressed economic growth in Europe, reduced government/education investments and transitions specific to a few large suppliers."

Data protection and recovery software and archiving software were once again the two fastest-growing market segments, with 2.4 percent and 2.2 percent year-over-year growth rates, respectively, or $1.16 billion and $404 million in total revenue, IDC said.

Read More

Google Drive Storage Updated for Android, iOS

Five months after unveiling its Google Drive cloud storage services, Google is now adding several useful features that aim to make Drive even easier and more flexible for users on both Apple iOS and Android mobile devices.

In a Sept. 10 post on the Google Enterprise Blog, Anil Sabharwal, senior product manager for the Google Drive Team, wrote that the improvements come as more users are choosing to get more things done in the cloud.

Apple device users can now for the first time "edit Google documents, just as you can with the Android app," wrote Sabharwal. "From your iPhone or iPad, you can create a new document, edit an existing one or format text. And just like on your computer, you’ll be able to see other people’s edits instantly as they’re made."

For iOS users, other new improvements include:

  • The ability to view Google presentations on your iPhone or iPad, including speaker notes, full-screen mode and the ability to swipe between slides, according to Sabharwal. "You can also create new folders, move files into folders and upload stuff (like photos and videos) from your device directly in the Drive app."

Android users will get improvements such as:

  • The ability to add comments, reply to existing comments and view tables in your Google documents. "And you’ll have the same new abilities to view presentations and organize your stuff as your friends with iPhones do," wrote Sabharwal.

Other new features are also in the works, he wrote, including native editing and real-time collaboration for Google spreadsheets.

Drive can be downloaded from Apple's App Store for iPhone, iPad or iPod devices, and from the Google Play Store for Android phones or tablets.

The Google Drive cloud service was launched April 24 after about six years of planning and talks about its intentions to introduce a cloud storage service. The Drive offering joined a busy cloud storage marketplace that was already packed with competitors such as Box and Dropbox.

Google Drive offers users up to 5GB of storage for free and is integrated with Google's core services, such as Google Docs, where users can do their work and then seamlessly store it in their part of the cloud for safekeeping and easy access.

In June, Google added Apple iOS support for Drive, which wasn't originally available when the service debuted. Drive now supports iOS, Windows and Google's Chrome OS operating systems.

Google Drive also includes support for a wide variety of file formats, even if the applications aren't installed on the user's device. That allows users to open the files for viewing as needed.

Drive proved to be very popular among users just after its launch. Sign-ups for the service grew to a "very strong start, with probably about 35 million to 40 million sign-ups in 15 days," according to an earlier eWEEK report.

Google provides free storage for up to 5GB in Google Drive. Extra storage is priced as follows: 25GB for $2.49 a month, 100GB for $4.99 a month, 200GB for $9.99 a month, 1TB for $49.99 a month and 16TB for $799.99 a month. Other increments below 16TB are available.
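
For anyone sizing a plan, the tier list above reduces to a simple lookup. The sketch below is an illustrative helper built from the published prices, not a Google API:

# Google Drive tiers as (size in GB, price in USD per month), smallest first.
TIERS = [(5, 0.00), (25, 2.49), (100, 4.99),
         (200, 9.99), (1000, 49.99), (16000, 799.99)]

def cheapest_tier(needed_gb):
    for size_gb, usd_per_month in TIERS:
        if needed_gb <= size_gb:
            return size_gb, usd_per_month
    raise ValueError("larger than the biggest listed tier")

print(cheapest_tier(80))  # -> (100, 4.99): 80GB needs the 100GB tier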

Read More

Riverbed Expands Whitewater Cloud Storage Portfolio

Riverbed Technology is expanding its Whitewater cloud storage portfolio to enable enterprises to handle larger backup data workloads with greater scalability and management capabilities.

Riverbed on Sept. 10 unveiled its Whitewater 3010 cloud storage gateway appliance, which offers four times the disk storage capacity (both on-premise and in the cloud) and twice the memory of the company's previous largest model, the Whitewater 2010. The company also unveiled the Whitewater Operating System (WWOS) 2.0, the latest version of its OS, which offers greater scalability and simplified management, according to Ray Villeneuve, general manager of Whitewater at Riverbed.

WWOS 2.0 has been in beta for several months, getting a strong embrace from businesses, Villeneuve told eWEEK. The software is generally available now.

The new appliance and upgraded operating system come as enterprise adoption of cloud storage is growing, with analysts at IDC expecting the basic public cloud storage space to grow 33.6 percent a year, to about $9.2 billion by 2015. Backup is a top usage model for public cloud storage, according to IDC, and gateway appliances are being used to enable enterprises to securely and efficiently leverage public clouds for data backup.

The move to using public clouds for backup is in the early stages, Villeneuve said, adding that enterprise trends, from securing data and driving down costs to improving service-level agreements (SLAs) and embracing the cloud, are shifting toward cloud storage backup "in a big, big way."

Enterprises for years have relied on tape for backup and, more recently, on replicated disk-based storage, but with the rapid growth in the amount of data being generated, these options are quickly becoming complex, expensive and risky for businesses worried about SLAs, he said. Cloud storage is becoming increasingly popular, particularly as cloud storage prices drop and its scale and flexibility improve.

Riverbed officials said Whitewater appliances can help businesses reduce data backup costs by as much as 30 to 50 percent, and the systems can be deployed in less than an hour.

The Whitewater 3010 is the fifth cloud storage gateway appliance offered by Riverbed, and its biggest yet. The Whitewater 2010 offers 8 terabytes of capacity, 32 gigabytes of internal memory and a maximum ingest rate of 1.25TB per hour. The 3010 brings with it 32TB of capacity, 64GB of memory and an ingest rate of up to 1.5TB per hour.

With WWOS 2.0, Riverbed officials said they are giving users simplified and efficient management capabilities through a host of new features, including greater support for large data backup blocks. When combined with the Whitewater 3010 appliance, the new operating system enables enterprises to support up to 32TB of deduplicated local storage and 160TB of deduplicated cloud storage, an effective capacity of 1.6 to 4.8 petabytes depending on deduplication ratios.
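Riverbed does not spell out the conversion in this announcement, but the 1.6PB-to-4.8PB range is consistent with simple deduplication arithmetic: 160TB of deduplicated cloud storage expanded at assumed ratios of 10:1 to 30:1. A minimal sketch of that calculation, with the ratios as explicit assumptions:

    # Illustrative only: effective capacity = deduplicated capacity x ratio.
    # The 10:1 and 30:1 ratios are assumptions, not Riverbed's figures.
    DEDUPED_CLOUD_TB = 160

    for ratio in (10, 30):
        effective_pb = DEDUPED_CLOUD_TB * ratio / 1000  # decimal TB -> PB
        print(f"{ratio}:1 dedup -> {effective_pb:.1f} PB effective capacity")

The sketch prints 1.6 PB and 4.8 PB, matching the range above.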

The OS also offers the Management Dashboard for real-time monitoring and reporting of cloud backup processes, remote and central management of one or more Whitewater appliances for everything from rebooting and shutting down systems to monitoring them, and proactive alerts that give IT administrators a heads-up when thresholds are close to being reached.

WWOS 2.0 also offers integration with Windows Active Directory for more streamlined operations and better security, according to company officials.

Riverbed’s rollout of its new Whitewater cloud storage gateway appliance and upgraded operating system comes two weeks after the company expanded its partnership with virtualization technology vendor VMware. At the VMworld 2012 show, Riverbed officials announced that the two companies were unveiling a performance management offering for software-defined networks (SDNs) through support in Riverbed’s Cascade product family for VXLAN technology.

In addition, Riverbed officials said their Steelhead Cloud Edition will be integrated with VMware’s vCloud Director solutions, enabling businesses to more easily enable WAN optimization in virtual data centers.

Riverbed and VMware also addressed virtual desktop infrastructures with a solution that combines Riverbed’s Granite edge virtual server infrastructure with VMware’s View technology. The two companies also announced the integration of Riverbed’s Stingray Traffic Manager (including the software and virtual application delivery controller) with VMware’s vFabric Application Director, a hybrid cloud application provisioning solution.

Riverbed Chief Marketing Officer David Green said it was important for Riverbed to push its vision for virtualization and cloud computing at VMware’s conference.

“In our mind, VMworld is where the industry comes together to talk about virtualization and to talk about the cloud,” Green told eWEEK.




Read More

Western Digital Debuts 2.5-inch Hybrid Hard Drive

Western Digital subsidiary and storage specialist WD unveiled what it claims to be the world’s thinnest 2.5-inch hybrid hard drive, designed to provide high-capacity storage while featuring instant-on and application performance similar to today's client solid state drives (SSDs). The company noted it is sampling the 5mm-thin hard disk drive and will showcase the technology at an investor’s conference later this week.

The technology pairs multi-level cell (MLC) NAND flash storage for SSD-like data throughput and instant-on responsiveness with magnetic disks for high-capacity storage. The drives also use data tiering, whereby the most frequently accessed data (known as “hot” data) is managed in NAND flash for fast response times, while “cold” data, accessed less often, resides on the magnetic disks.
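To make the hot/cold split concrete, here is a minimal, hypothetical sketch of frequency-based tiering; it illustrates the general technique, not WD's firmware logic, and the promotion threshold is an invented parameter:

    # Blocks that cross an access-count threshold are promoted to the
    # fast tier (standing in for NAND flash); the rest stay on disk.
    from collections import Counter

    class TieredStore:
        def __init__(self, hot_threshold=3):
            self.hot_threshold = hot_threshold
            self.access_counts = Counter()
            self.hot_tier = set()

        def read(self, block_id):
            self.access_counts[block_id] += 1
            if self.access_counts[block_id] >= self.hot_threshold:
                self.hot_tier.add(block_id)  # promote a "hot" block
            tier = "flash" if block_id in self.hot_tier else "disk"
            return f"block {block_id} served from {tier}"

    store = TieredStore()
    for _ in range(3):
        print(store.read("boot-block"))  # third read comes from "flash"

Real drives make the same decision with far richer heuristics, but the principle is identical: repeated access earns a block a place in flash.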

"Mobile devices are becoming smaller, thinner, lighter and more responsive," Matt Rutledge, vice president of client storage solutions at WD, said in a prepared statement. "Working with our technology partners, WD has developed new 5 mm hard drives that enable high capacity storage along with excellent performance and superior economics to allow our customers to expand their thin offerings."

The tiered design in WD hybrid hard drives, which works in conjunction with the PC operating system, also provides users with data redundancy, as the magnetic disk backs up all files residing in the NAND, protecting the user against NAND wear and preserving the flash for hot data handling. The single-unit design also provides benefits such as lower power consumption, greater operating shock tolerance and data protection. The company counts computer manufacturers Acer and Asus as partners supporting the drive.

"Acer is partnering with WD to bring advanced notebook performance and capacity in the smallest form factor," David Lee, associate vice president of mobile computing product business unit at Acer, said in a press statement. "It's a part of our ongoing commitment to present leading technology that ultimately improves the total user experience of our customers."

The release comes as WD wraps up a busy summer where the company launched or updated a slew of storage solutions, starting with the July debut of the Red line of network-attached storage (NAS) hard drives for small office/home office applications. In August, the company released the My Book VelociRaptor Duo dual-drive storage system, supported by the Mac OS X operating system and aimed at creative professionals, and added the USB 3.0 interface to its line of My Passport for Mac portable hard drives, increasing capacity for Mac computer users up to 2TB.

Earlier this month, WD announced the latest version of My Passport line of portable hard drives for PC and Mac, My Passport Edge, with drives featuring 500GB of storage and a USB 3.0 interface for improved read/write speeds, and a variety of security features to protect the drive's content from unauthorized access. Security features allow users to set password protection and hardware encryption and protect files from unauthorized use or access, while the company’s SmartWare automatic continuous backup software helps protect data using minimal PC resources.




Read More

Monday, September 10, 2012

EMC Still Dominating Healthy Global Storage Sector

The global disk-based storage market, rebounding from a big hiccup a year ago following Thailand floods that knocked out several key hard-disk drive manufacturing facilities, appears to be back on a healthy revenue track.

Industry researcher Gartner reported Sept. 7 that worldwide external controller-based disk storage vendor revenue amounted to $5.5 billion in the second quarter of 2012, a non-spectacular but nonetheless solid 6.7 percent increase from revenue of $5.1 billion in the second quarter a year ago.

The second quarter of 2012 was the 11th consecutive quarter of revenue growth in the sector, Gartner said, but it fell short of the researcher's projection of a 7.9 percent year-over-year increase.

EMC, Oracle, Fujitsu Make Gains

EMC, Fujitsu and Oracle produced year-over-year revenue gains that outgrew the industry average.

EMC used its optimized-to-fit product strategy and new-gen VMAX storage arrays to increase its leading external controller-based disk-storage platform market share to 33.3 percent at $1.8 billion in revenue. Second-place IBM has 13.8 percent of the global market and third-place NetApp is at 11.1 percent.

Hewlett-Packard (9.4 percent), Hitachi Data Systems (8.7 percent) and Dell (7.3 percent) follow on the list.

Due to increased sales of its ZFS Storage Appliance, Oracle increased its year-over-year market share for the first time since it acquired Sun Microsystems in January 2010. By bringing in $105.5 million in the second quarter of 2012, Oracle is in seventh place, with 1.9 percent of the world market.

Fujitsu, which is benefiting from a purchasing rebound in Japan, is also improving its market share in Europe, where its Fujitsu Technology Solutions subsidiary produced improved results selling the Fujitsu Eternus-branded storage products. With $83 million in second-quarter revenue, Fujitsu is in eighth place with 1.5 percent of the world market.

External Pressures in Europe a Factor

"Although the hard-disk drive supply issues created by the October 2011 Thailand flood was no longer an impediment on meeting user demand, the economy in certain regions had a debilitating impact on vendor revenue in the second quarter of 2012," said Gartner Vice President Roger Cox.

Cox said that the sluggish macroeconomic climate in Europe has affected sales there.

"In particular, the dour EMEA [Europe, Middle East and Africa] economy dampened year-over-year vendor revenue growth to just 2.6 percent against a forecast of 7.4 percent, while the slowing Asia-Pacific economy held year-over-year vendor revenue growth to 9 percent, 7.1 percentage points lower than Gartner's forecast," Cox said. "Only the North American region and Japan met or exceeded our expectations in the second quarter of 2012."




Read More

EMC, IBM, NetApp Lead Disk Storage Systems Market: IDC

The outlook for the worldwide disk storage systems market continues to look promising after external disk storage systems posted nearly $6 billion in factory revenue in the second quarter of 2012, up 6.5 percent from a year earlier, according to IT research firm IDC's Worldwide Quarterly Disk Storage Systems Tracker.

EMC maintained its lead in the external disk storage systems market with 30.4 percent revenue share in the second quarter, followed by IBM and NetApp in a statistical tie for second with 12.9 percent and 12.1 percent market share, respectively.

HP captured the No. 4 position with a 10.7 percent market share, while Hitachi (with 8.1 percent share) and Dell (with 7.8 percent) rounded out the top five in a statistical tie for fifth place. (IDC declares a statistical tie in the worldwide disk storage market when there is less than 1 percentage point difference in the factory revenue of two or more vendors.)

The total disk storage systems market posted just under $8.1 billion in revenue in the second quarter, representing 8 percent growth from the prior year's second quarter. Total worldwide disk storage systems capacity shipped reached 6,667 petabytes, growing nearly a quarter (24.8 percent) year-over-year.

"The external disk storage system market continues to grow on a stable trajectory with factory revenue approaching $6 billion for the first time in the second quarter," Liz Conner, senior research analyst with IDC’s storage systems division, said in a prepared statement. "Despite concerns regarding the global and regional economies, end users continue to invest in storage infrastructures. Helping to drive the worldwide market in Q2 was the double-digit growth in the emerging regions and strong demand for midrange storage."

In the total open networked storage market, EMC continues to maintain its lead with 33.7 percent revenue share, followed by NetApp with a 14 percent revenue share. With just under $5.2 billion in revenue, the total open networked disk storage market (network-attached storage combined with Open / iSCSI storage-area network) experienced growth of 6.5 percent year-over-year in the second quarter. EMC was also the leading vendor in the Open SAN market (which grew 8 percent year-over-year) with 29.4 percent revenue share, followed by IBM in second place (15.2 percent share) and HP in third (12.8 percent share).

"The midrange storage class [average selling prices in the $25K to $249.99K range] grew faster than any other class, at 12.2 percent year-over-year growth, in the second quarter," Amita Potnis, senior research analyst in IDC’s storage systems division, said in a press statement. "At 48.2 percent share of total worldwide external revenue in 2Q12, IDC expects that the midrange storage class will soon cross the 50 percent revenue share milestone. IDC believes that this class will be the fastest-growing class as vendors bring to market modular systems offering enterprise-level functionality such as compression, tiering, and data de-duplication at greater affordability."




Read More

Western Digital Updates My Passport Edge Portable Hard Drives

Western Digital Corp. subsidiary WD announced the latest version of its My Passport line of portable hard drives for PC and Mac with the debut of My Passport Edge. The drives feature 500GB of storage, a USB 3.0 interface for improved read/write speeds, and a variety of security features to protect the drive's content from unauthorized access.

The drives come with a three-year limited warranty and are available on the WD store and at select retailers. My Passport Edge retails for $109.99 and My Passport Edge for Mac lists for $119.00.

For Mac owners, the latest edition of My Passport Edge for Mac complements Apple’s MacBook and MacBook Air computer designs with an all-aluminum exterior to protect the drive and its contents. The Mac edition is also compatible with Apple Time Machine backup utility for seamless operation out-of-the-box with a user’s Mac computer. The PC version of the hard drive incorporates WD SmartWare continuous and automatic backup software to create a copy of users' computer content to ensure personal digital files are backed up and protected in the event of computer loss or theft. My Passport Edge was also re-engineered with a new design for a premium finish.

On the security side, users can deploy WD Security to set password protection and hardware encryption and protect files from unauthorized use or access, while the company’s SmartWare automatic continuous backup software helps protect data using minimal PC resources. Whenever a user adds or changes a file, it is backed up, while the drive itself is built for durability, shock tolerance and long-term reliability. With WD Drive Utilities, users can register the drive, set drive timers, run diagnostics and more.

"WD's new My Passport Edge portable drives for Mac and PC offer mobile users a safe, compact, attractive way to conveniently work or play anywhere life or business may take them," Jim Welsh, executive vice president and general manager of WD's branded products and consumer electronics groups, said in a prepared statement. "We know that people who bring personal and business content along with them on their travels are concerned with portability and with maintaining privacy while also protecting their data. We design all My Passport products with a small footprint and with security and protection in mind."

The company’s latest announcement caps a busy summer in which WD rolled out a slew of updates and new releases to its portable storage drives. In August, WD unveiled its My Book VelociRaptor Duo dual-drive storage system, supported by the Mac OS X operating system and aimed at creative professionals. The drive also provides a "just a bunch of disks" (JBOD) option for users running a Windows operating system on a Mac, and sports two 1TB 10,000rpm WD VelociRaptor drives along with two Thunderbolt ports, with a Thunderbolt cable included in the package.




Read More

HP Picks Microsoft Exec to Run Analytics Division Autonomy

A year after it shelled out $11 billion for a virtually unknown data storage and management company from the U.K. called Autonomy, Hewlett-Packard believes it has found the right person to manage it.

Former Microsoft executive Robert Youngjohns on Sept. 17 will take over as senior vice president and general manager of the Autonomy/Information Management business unit -- a division upon which HP is counting to become a substantial profit leader in years to come.

Youngjohns, who will report directly to the executive vice president of HP Software, George Kadifa, was president of Microsoft's North American region. Previously, he was president and CEO of Callidus Software following managerial jobs at Sun Microsystems and IBM.

Youngjohns replaces former Autonomy CEO and founder Mike Lynch, whom HP let go last May at the same time it announced a restructuring that would cut about 27,000 jobs globally.

Why Co-Founder Was Let Go

At the time of Lynch's release, HP CFO Cathie Lesjak said that Autonomy's "license revenue was disappointing, sales execution was a challenge and big deals were taking longer to close." CEO Meg Whitman said that Autonomy's problems were "not the product ... it's not the market ... it's not the competition. This is classic entrepreneurial company scaling challenges -- it's a whole different ball game."

In the face of clear-cut industry trends toward more IT spending for refreshed data center hardware and software to process larger and larger business workloads, Youngjohns will be charged with leading Autonomy to take HP's products and services in the information management and analytics software sector to the next level.

Gartner, IDC and other analytics firms have estimated that more than 80 percent of the world's data is not kept inside databases. Thus, HP bought Autonomy with the idea that it would help it become the world's leading manager and analyst of unstructured data stores.

Autonomy, which evolved out of a project at Cambridge University under Lynch 15 years ago, has developed software that can sift through huge data stores and categorize patterns found in unstructured and semi-structured information. It effectively adds structure to non-database data so that it can be used for business purposes.

Law firms, for example, can use Autonomy to filter quickly through emails or other data for legal evidence in court cases. Enterprises can use it to detect internal fraud or perform research for compliance purposes.

This is all very different from analyzing data that is already housed within the walls of columns and rows. Plenty of companies already do that; not as many do what Autonomy does.

Cloud Deployment a Key Factor

The kicker is that Autonomy is engineered to perform these services through cloud-based services. Since HP already has the cloud infrastructure ready and waiting for these capabilities, the original idea for a good fit between the two companies remains an interesting business proposition.

In June 2012, Autonomy announced a series of cloud-based packages designed to help organizations generate a greater return on their big data initiatives. Based on the HP Converged Cloud and the Autonomy Intelligent Data Operating Layer (IDOL) 10, these products include new capabilities for processing Hadoop data, as well as a new Clickstream analytics solution.

The solutions enable businesses to discover new trends, opportunities and risks, and accelerate revenue growth by understanding and acting on Web Clickstream, sentiment and transactional data.

eWEEK Senior Editor Darryl K. Taft contributed to this story.

Chris Preimesberger is Editor of Features and Analysis for eWEEK. Twitter: @editingwhiz




Read More

Data Migration Project Planning: 10 Best Practices to Implement

By Chris Preimesberger on 2012-09-05

Data integration involves combining and reconciling data residing in different storage areas, which could be on site or in the cloud, and giving users a unified view of this data. This process becomes a significant undertaking in a variety of situations, both commercial (when two similar companies need to merge their databases) and scientific (combining research results from different bioinformatics repositories, for example). Data integration issues are increasing in frequency as data volumes, the number of file formats and the need to share existing data all explode. The field has become the focus of extensive theoretical work, and numerous open problems remain unsolved. In management circles, data integration is frequently referred to as enterprise information integration. It can be difficult to know where to start a data migration project, because any enterprise has many data stores and agendas. To shed some light on the problems, eWEEK and integration expert Arvind Singh, CEO of data management consultancy Utopia Inc., have put together some best practices to consider before starting an enterprise data migration project.

Determine the Purpose of Your Data

Analyze the information environment, identifying where and how the data will be leveraged, and who will actually use it. Determine how the data could be used differently tomorrow, in analytics, for example.

Take an 'As-is' Assessment Before Moving Data

Data is dynamic—changing all the time—and it is tied directly to business processes and uses. Establish data standards and define business rules for migration and ongoing data use with owners and subject matter experts.

Account for Data Quality, Especially in Legacy Systems

Migrating involves more than moving data. Perform a thorough quality assessment to ensure standardization that supports new uses and users today and tomorrow. This should include deduplication (a minimal sketch follows below), removal of other non-relevant files and possibly a master data management-type process.
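As a concrete illustration of the deduplication step, here is a minimal sketch; the record fields and the normalization rule are hypothetical, and a real master data management process would use much richer matching:

    # Records that normalize to the same key are treated as duplicates;
    # the first occurrence wins. Field names here are invented examples.
    def normalize(record):
        return tuple(record[f].strip().lower() for f in ("name", "email"))

    def dedupe(records):
        seen, unique = set(), []
        for rec in records:
            key = normalize(rec)
            if key not in seen:
                seen.add(key)
                unique.append(rec)
        return unique

    rows = [{"name": "Acme Corp ", "email": "INFO@ACME.COM"},
            {"name": "acme corp", "email": "info@acme.com"}]
    print(dedupe(rows))  # only one of the two near-duplicates survives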

Validate and Redefine Business Rules

Data must comply with, and be compatible with, current business and validation rules. Define rules for one-time data conversion and migration, while designing them to adapt to future regulatory and policy requirements (one way to do this is sketched below).
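One way to keep migration rules adaptable, as the practice above suggests, is to express them as data rather than hard-code them. A small hypothetical sketch (the fields and rules are invented for illustration):

    # Keeping validation rules in a table makes them easy to revise when
    # policy or regulatory requirements change after the one-time migration.
    RULES = {
        "country_code": lambda v: len(v) == 2 and v.isalpha(),
        "unit_price":   lambda v: float(v) >= 0,
    }

    def violations(record):
        return [field for field, check in RULES.items()
                if field in record and not check(record[field])]

    print(violations({"country_code": "USA", "unit_price": "19.99"}))
    # -> ['country_code']  (three letters where two are required)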

Ensure that Governance Rules are Established Early and Define Who Is Responsible for the Data

After determining who owns and has final say-so over information, establish strategic and operational data "stewards" aligned to "C-level" sponsors for continuing guidance on scope, direction and support.

Take Responsibility for Your Data Migration

The company, not only the systems integrator, must live with the results of data migration. Based on aptitude and attitude, find the right people to manage processes and technology correctly.

Don't Rely 100 Percent on the Tool

Tools are only tools. Have internal experts tailor fields and business rules to your company's needs, using the tools to obtain the right information and complete the data migration.

Validate Throughout the Process

Don't wait until migration is completed to look for problems. Fixing mistakes after the fact is exorbitantly expensive. Carefully choose testing and evaluation personnel based on critical participant and data consumer needs.

Engage the Business

As migration takes place and nears completion, be ever mindful of which users, customers and business partners have to live with the results. Choose carefully and accurately who makes the final decision whether the migration is "good enough."

Measure Impact

Take your time in determining who should be involved in testing, evaluation and final sign-off of the data migration, while being keenly aware of the data's ultimate consumer. After "go-live," focus on operational business processes to maintain relevant, high-quality data.

Read More

Tuesday, September 4, 2012

How to Skip the Welcome Screen & Logon Automatically on Windows

Imagine not having to sit through the logon screen or enter a password every time before reaching the Start screen. It's a genuine time saver, and the method below takes less than a minute. If you are tired of the old Windows logon screen and want a user account to log on automatically without entering a password, follow these steps:

  • Open the 'Run' window from the 'Start' menu, or press 'Windows key + R' to open it directly.
  • In the 'Run' window, type 'control userpasswords2' and press Enter.
  • In the 'User Accounts' window that opens, uncheck the box labeled 'Users must enter a user name and password to use this computer'.
  • Click 'Apply' to save the setting.
  • When prompted, enter the user name and password of the account you want to log on as automatically on subsequent reboots.

That's all it takes. To undo this change, follow the same steps but recheck the 'Users must enter a user name and password to use this computer' box. A registry-based alternative is sketched below.
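For those comfortable with the registry, the same setting can be made through the well-known Winlogon values. A sketch in Python (Windows-only, must be run as administrator; note that DefaultPassword is stored in plain text, so this trades security for convenience; the example user name and password are placeholders):

    # Sets the standard Winlogon auto-logon values via the registry.
    import winreg

    KEY = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"

    def enable_autologon(user, password):
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY, 0,
                            winreg.KEY_SET_VALUE) as k:
            winreg.SetValueEx(k, "AutoAdminLogon", 0, winreg.REG_SZ, "1")
            winreg.SetValueEx(k, "DefaultUserName", 0, winreg.REG_SZ, user)
            winreg.SetValueEx(k, "DefaultPassword", 0, winreg.REG_SZ, password)

    # enable_autologon("YourUserName", "YourPassword")  # placeholder values

To undo it, set AutoAdminLogon back to "0" and delete the DefaultPassword value.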
Read More

Tips to repair JPEG Images

Most camera users and photographers save their photos in the JPEG (or JPG) image format because of its graphics and photo quality. JPEG stands for Joint Photographic Experts Group, the joint committee between ITU-T (formerly CCITT) and ISO/IEC that creates the JPEG standards. Because JPEG is the most commonly used photo format in the world, it is also the format in which photographers most often encounter corruption. Common causes include power loss or fluctuation while the camera is plugged in, saving images to a virus-infected PC or laptop, a camera getting wet in the rain, or file system corruption; in all of these cases the images no longer display properly.

In any case of corruption, stop using the device (memory card, SD card or hard drive) immediately, as continued use may cause further damage to your photos. If the camera has failed or the memory card or hard drive is physically damaged, the only way to recover the photos is through a good data recovery company. Logical damage, however, is recoverable: if your photos are corrupted but there is no physical damage to the card or drive, some of the JPEG repair tools available on the net are effective enough to repair corrupt JPG files within a minute and restore them to their original form.
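A quick way to triage a suspect file before reaching for repair software: every healthy JPEG begins with the SOI marker (FF D8) and ends with the EOI marker (FF D9). The sketch below (illustrative; the file name is a placeholder, and intact markers don't guarantee an undamaged image) checks both:

    # Missing SOI/EOI markers strongly suggest truncation or corruption.
    def jpeg_markers_intact(path):
        with open(path, "rb") as f:
            head = f.read(2)
            f.seek(-2, 2)   # seek to the last two bytes of the file
            tail = f.read(2)
        return head == b"\xFF\xD8" and tail == b"\xFF\xD9"

    # print(jpeg_markers_intact("photo.jpg"))  # placeholder file name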
Read More

Step By Step Guide to Run Startup Repair on Windows 7

Whether a system file is missing or Windows has become corrupted, Startup Repair may be the best available option for fixing it immediately. Missing system files such as boot.ini can prevent your system from starting normally. Attempting repairs carries a real risk of losing all your data, and even of hardware failure, so beware of trying it on your own, at least if you are a computer novice; ask a friend or a technician with troubleshooting experience for help instead.

Startup Repair can be found under the System Recovery Options menu, which is likely to be pre-installed with your OS or provided by your manufacturer. If you can't find it, you can use your Windows 7 installation disc to initiate Startup Repair.

Note: As far as data safety is concerned, Startup Repair can only repair or replace missing or deleted Windows files; it cannot recover deleted, damaged or formatted files, and it cannot safeguard your data against hardware failure or virus corruption. It is therefore highly recommended that you back up regularly to avoid data loss.

Steps to run Startup Repair:

a) Navigate to the System Recovery Options using the Windows 7 CD or a startup repair disc (this is explained in the earlier System Recovery Options guide).

b) Click the 'Startup Repair' option. This starts a complete scan of your system and attempts to fix any problems found during the scan. Your system may restart several times as it searches for and fixes system files.

c) When the scan finishes, Startup Repair shows its results. If no problems were found, click the 'Next' button. You can review what Startup Repair has fixed by clicking the 'View diagnostic and repair details' link.

d) Click 'Restart' and see whether your problem is solved. If not, contact your manufacturer or a technical support team.
Read More

Windows Vista Hangs at Black Screen

TxF, or Transactional NTFS, is a component introduced in Windows Vista that lets you perform file operations on an NTFS volume as transactions. Using Transactional NTFS, you can atomically create, modify, delete and rename files and folders, and TxF ensures that the operations are committed only if all of them complete successfully. While a TxF process is running, no other process should interrupt it; otherwise a deadlock condition can result. Such conditions usually arise from file system corruption, and resolving them requires reinstalling the operating system. Use your backup to restore the lost data; if the backup is damaged or incomplete, data recovery software can be used for complete data restoration.

You might observe the following symptoms on a Windows Vista-based system:

  • When you try to start the system, it doesn't boot and hangs at a black screen.
  • Attempting to restore the system using WinRE (Windows Recovery Environment) doesn't help, because it also stops responding.
  • You cannot start the system in Safe Mode.
  • The system is unresponsive even when you use the Windows Vista installation disc to try to repair it.

Cause: A Windows Vista system exhibits this behavior when a deadlock occurs between the Windows Vista Autocheck and TxF (Transactional NTFS) processes. It primarily happens when the file system becomes corrupted within the $TxF directory.

Solution: To resolve the issue, perform a clean reinstallation of Windows Vista, or opt for a parallel installation. The former causes complete data loss, so use your latest backup to restore the information. If the backup is damaged or incomplete, you will need data recovery tools. These utilities use safe scanning algorithms to scan the logically crashed media and recover all the data, unless it has been overwritten, and offer a graphically rich user interface, complete results and secure scanning.
Read More