
DigitalOcean launches monitoring service for its cloud servers – BGR India


US-based cloud infrastructure provider DigitalOcean on Wednesday launched a monitoring service that provides insight into the resource utilization and operational health of every Droplet (cloud server). Developers can collect and visualize metrics in ...

View original post here:
DigitalOcean launches monitoring service for its cloud servers - BGR India

Read More..

IBM Bare Metal Cloud Targets AI with New P100 GPUs – HPCwire (blog)

IBM announced today that it will be adding Nvidia P100 graphics processors to its Bluemix cloud later this month, becoming the first major global cloud vendor to provide the high-end Pascal GPUs. Big Blue is targeting the new hardware at customers who run compute-heavy workloads, such as artificial intelligence, deep learning, data analytics and high-performance computing.

Unlike Nimbix, the heterogeneous cloud vendor that began offering NVLink-attached Nvidia P100 GPUs on the IBM Minsky Power8 platform last October (2016), IBM will be using PCIe form factor cards within an Intel x86 server. This is not really a surprise, since IBM operates most of its cloud servers on Intel-based chipsets. Customers will be able to add up to two Nvidia P100 cards to a dual Xeon E5-2690 v3 machine (two 12-core CPUs running at 2.6 GHz).

The IBM cloud does have some Power server options for specific big data workloads, but it does not have an expanded assortment of Power, says Jay Jubran, Global Offering Management for Compute at IBM Cloud. A plan to integrate Power8-based systems with Nvidia P100 GPUs into the IBM cloud portfolio is underway. "We are working side by side with the Power Systems team to ensure that IBM Cloud will deliver access to the best of IBM technology to allow customers to run HPC and AI workloads," Jubran told us.

The Power8 Minsky platform enables tight coupling of the Power CPU and P100 GPU over Nvidia's proprietary NVLink interconnect. The mezzanine form factor P100 also provides nearly 13 percent better raw performance than the PCIe card: 5.3 double-precision teraflops versus 4.7. Both versions provide 16 gigabytes of HBM2 stacked memory. Networking on the IBM cloud stands at 10 Gigabit Ethernet today, with IBM stating that future platforms might go up to 25 Gigabit Ethernet.
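The "nearly 13 percent" figure follows directly from the two throughput numbers quoted above; a quick sanity check (variable names are ours):

```python
# Double-precision throughput figures quoted above, in teraflops.
nvlink_p100_tflops = 5.3  # mezzanine (NVLink) form factor
pcie_p100_tflops = 4.7    # PCIe form factor

# Relative advantage of the mezzanine card over the PCIe card.
advantage_pct = (nvlink_p100_tflops - pcie_p100_tflops) / pcie_p100_tflops * 100
print(f"{advantage_pct:.1f}%")  # 12.8%, i.e. "nearly 13 percent"
```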

IBM will be first to the P100 punch in terms of major cloud providers, but as we have seen, other cloud purveyors are advancing with P100 plays of their own. Here's a rundown:

Nimbix: As mentioned above, Nimbix added IBM Power S822LC for HPC systems (codenamed Minsky) to its heterogeneous HPC cloud platform last October. Target markets include high-performance computing, data analytics, in-memory databases, and machine learning.

Cirrascale: On its GPU-driven deep learning infrastructure-as-a-service, San Diego, Calif.-based Cirrascale offers a number of P100-based server configurations, including four-way and eight-way Intel-based GPU servers and IBM Power8 systems with two- and four-GPU options.

Google: The Google Cloud Platform website states that P100s are coming soon. Google will also be incorporating AMD FirePro S9300 x2 GPUs into its infrastructure. Google began offering K80 GPU-equipped virtual machines (as a beta release) in February of this year.

Microsoft: Microsoft last month revealed blueprints for a new open source P100-based accelerator, HGX-1, developed under Project Olympus. It's an accelerator box with eight Tesla P100s, connected in the same hypercube mesh as the Nvidia DGX-1 server and also leveraging the NVLink interconnect. The HGX-1 hooks to servers via a PCIe interface. We're to assume the boxes, being manufactured by Ingrasys, will show up on Azure, but Microsoft hasn't indicated when that will be. The company has had some notable delays in GPU rollouts, announcing a planned K80 instance in September 2015 only for AWS to beat it to general availability a year later.

Tencent: Two weeks ago, Chinese cloud giant Tencent said it will offer a range of cloud products that will include GPU cloud servers incorporating Nvidia Tesla P100, P40 and M40 GPU accelerators and Nvidia deep learning software. Tencent Cloud launched GPU servers based on Nvidia Tesla M40 GPUs and Nvidia deep learning software in December; it expects to integrate cloud servers with up to eight Pascal-based GPUs each by mid-year.

Reigning cloud king Amazon does not yet offer Nvidia's Pascal-based silicon (the P100 or the P40 inferencing engine). Amazon's most recent P2 instance family is backed by Kepler-generation K80 parts, rolled out last September (2016).

IBM emphasized the advantage of its bare metal cloud offering, compared to the multi-tenant environments of AWS and the other mega-cloud providers, especially for HPC workloads. "The main reason why people come to IBM cloud, other than the global presence, is the performance and consistency of having access to the bare metal. The bare metal allows us to give better performance than any other virtualized environment with the same specification because we do not have the hypervisor tax, which is roughly 10-15 percent of the CPU power," said Jubran.

"We find HPC workloads typically find their way to the IBM cloud. If the customer is looking to run HPC on an hourly basis, sometimes you'll see them go to other clouds, but in terms of monthly consumption we have the best offering in terms of performance and price value," he added.

The bare metal infrastructure is also attractive to the graphics community, for gaming (especially a subset called cognitive gaming) and for engineering, said Jubran. Financial services, healthcare, and retail are all target verticals.

Customers that prioritize highly elastic resources and pay-by-the-sip pricing typically go to IBM's competitors, Jubran noted, but its core customers are the ones who understand the performance metrics that IBM offers.

"We are attracting both digital customers looking for performance, gaming customers and born-on-the-web type customers who are looking for bare metal performance but the scalability of the cloud. And we also get the higher end of the spectrum in terms of enterprise, and that is because of IBM obviously being an enterprise-focused company from day one, and they put trust in IBM to bring their workload to our datacenters. So having both aspects of the spectrum keeps us on the innovative side in terms of digital and keeps us on the high-performance, secure side for the enterprise," said Jubran.

Aside from the advantage of this enterprise trust factor, IBM's distributed model of 50 datacenters (built up since the SoftLayer acquisition in 2013 for a reported $2 billion) gives it the geo-precision to provide local data sovereignty for its customers and is a natural fit for edge computing (important for AI training workflows and for IoT). For many customers, proximity of compute and data is far more important than saving on compute cost via the greater elasticity of mega-datacenters. A typical IBM datacenter unit consists of roughly 20,000 servers; in the hyperscaler world, that's pretty small.

The Tesla P100 joins Nvidia's portfolio of GPU offerings on the IBM Cloud, including the older Tesla K2 GPU, the Tesla M60 for virtualized graphics and the Tesla K80, which IBM added in 2015, about a year ahead of the competition. IBM expects most of its K80 customers will migrate over to the P100 servers as it begins adding the parts later this month. "We also expect newcomers into the AI platform as the P100 is the most powerful GPU in terms of AI workloads that are based on TensorFlow, Caffe, the Nvidia SDK or any of the AI SDKs available today," said Jubran. "With so much focus from all the different industries in AI, I think you will see more and more of those workloads coming to IBM cloud, and the P100 will enable that. If you look at the Nvidia material for the P100, it is the most powerful GPU for both training and inferencing, the two aspects of AI."

"With all key deep learning frameworks GPU-accelerated and over 400 HPC applications in a broad range of domains, including the top 10 high performance computing applications, IBM Cloud customers can quickly tap into the power of our GPU platform to boost performance, accelerate time to results and save money," Nvidia's vice president of accelerated computing Ian Buck wrote in a blog post.

The cost for the new Pascal-based hardware is $750 per month per P100 GPU card, tacked on to the price of the server. That is a 50 percent premium over the cost of the K80s ($500 per card), but the P100 card offers a 60 percent performance improvement over the K80. That should make switching a no-brainer, and while IBM won't be forcing customers with active workloads off the K80, it is planning to sunset the older Teslas as inventory depletes.
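Those two percentages imply the P100 also comes out slightly ahead on performance per dollar. A rough check using the article's figures (the relative-performance number is IBM's claim; the arithmetic is ours):

```python
k80_price, p100_price = 500, 750  # monthly price per GPU card, per IBM
k80_perf, p100_perf = 1.0, 1.6    # relative performance: P100 = K80 + 60%

premium = p100_price / k80_price - 1  # price premium of the P100
perf_per_dollar_gain = (p100_perf / p100_price) / (k80_perf / k80_price) - 1
print(f"premium: {premium:.0%}, perf-per-dollar gain: {perf_per_dollar_gain:.1%}")
# premium: 50%, perf-per-dollar gain: 6.7%
```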

Editor's note, April 6, 2017: In an earlier version of this article, we reported (based on information IBM shared with us) that the Power8 Minsky platform was not on IBM's cloud roadmap. After the article was published, IBM contacted us to let us know that it does have plans to incorporate Power8-based systems with Nvidia P100 GPUs into its cloud portfolio. We have amended the story to include this updated information.

Original post:
IBM Bare Metal Cloud Targets AI with New P100 GPUs - HPCwire (blog)

Read More..

Financial Industry IT Professionals and Executives Believe Data is Safer in the Cloud than On-Premises – PR Newswire (press release)

Other key results of the survey showed that, on average, financial organizations have around three services (3.25 average) in the cloud. Servers/data centers, Microsoft Exchange and Office, and other SaaS offerings were cited as the top deployed cloud services. Additionally, the survey indicated that adoption will continue to be strong, with 87.5 percent of respondents planning on adding new or additional cloud services in the next three years.

The survey results also found that 53.5 percent of organizations had deployed a cloud solution on their own versus using a third party provider. However, when asked if they had to start the deployment over, nearly half (45.5 percent) said they would outsource to a solution provider the next time.

"Financial organizations have recognized the significant benefits of the cloud, and the pace of their adoption, and planned future adoption, is faster than many other industries," said Scott Kinka, Chief Technology Officer of Evolve IP. "Leveraging the Evolve IP Compliance Cloud™ for PCI and FFIEC, along with geographically redundant Tier IV data centers, Evolve IP can help these organizations successfully deploy compliant cloud computing and communications services to secure systems and data, drive productivity and improve member and customer experiences."

Survey Methodology

The blind, web-based survey was conducted by Evolve IP during January of 2017, featuring over 1,300 respondents in multiple industries across North America. Over 110 of the respondents indicated they were in the financial industry. Of the financial respondents, 72.5 percent came from organizations with between 50 and 5,000 employees, 14 percent from companies with more than 5,000, 10 percent were businesses with 11-49 associates and 3.5 percent were small businesses with 10 employees or less. Evolve IP customers were excluded from the survey.

ABOUT EVOLVE IP

Evolve IP is The Cloud Services Company. Designed from the beginning to provide organizations with a unified option for cloud services, Evolve IP enables decision-makers to migrate all or select IT technologies to its award-winning cloud platform. Evolve IP's combination of security, stability, scalability, and lower total cost of ownership is fundamentally superior to outdated legacy systems and other cloud offerings. Today the company's services, including disaster recovery, IP phone systems / unified communications, contact centers, virtual desktops, IaaS and more, are deployed by more than 1,300 commercial business accounts with a combined 130,000+ users, licensed seats and managed end points. Visit http://www.EvolveIP.net for more information.

To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/financial-industry-it-professionals-and-executives-believe-data-is-safer-in-the-cloud-than-on-premises-300435933.html

SOURCE Evolve IP

http://www.evolveip.net

See more here:
Financial Industry IT Professionals and Executives Believe Data is Safer in the Cloud than On-Premises - PR Newswire (press release)

Read More..

Oracle Claims First Truly Converged Cloud/NAS Storage Platform – Top Tech News

Oracle announced a new platform this week to help enterprises integrate cloud storage services with on-premises NAS storage systems. The platform, called Oracle Cloud Converged Storage, is based on Oracle's ZFS Cloud software, which is included in the latest Oracle ZFS Storage Appliance release.

The company is pitching the solution as the first truly converged storage solution that offers both public cloud and on-premises, high-performance NAS (network attached storage).

The company said the new tools will allow organizations to move data or applications from on-premises storage to the cloud more easily. There's no need for external cloud gateways or software, or cloud access licenses from legacy, on-premises vendors in order to access public clouds from their infrastructure platforms.

Easier Path to Integration

Oracle said its new approach significantly eases the migration burden for IT administrators as they move from their own on-premises platforms to public cloud integration. The difficulty usually comes when trying to integrate disparate environments with different security requirements and a variety of industry standards. Maintaining end-to-end visibility, diagnostics and support can also be a struggle.

With its ZFS Cloud, Oracle thinks it can beat both public cloud providers who cannot deliver on-premises, high-performance storage systems, and also traditional hardware vendors who lack truly integrated public clouds. So says analyst Mark Peters of the Enterprise Strategy Group. Peters says Oracle's converged platform delivers "a genuine hybrid data ability with a 'cloud insurance option' built right into the storage system."

Oracle says the convergence of its storage cloud with ZFS Storage Appliances will provide enterprise clients with the same performance abilities as flash storage drives plus the added agility and simplicity of Oracle's cloud storage solution. The company said the converged system can be used for elastic application storage as well as backup and recovery. Other uses include development, testing, active archive storage, and snapshot replica storage. It can also be used by DevOps with a single API for both on-premises storage and the Oracle Storage Cloud, and for lift-and-shift workload migration. Newer applications can also leverage data both in Oracle ZFS Storage Appliances and in the Oracle Storage Cloud without any application changes.

Cloud Meets On-Premises

"Cloud [technology] is forcing IT practitioners to rethink their organization's infrastructure to accommodate current technology while future-proofing their business for tomorrow," said Steve Zivanic, Oracle's VP of Storage and Converged Infrastructure. "By converging the Oracle ZFS Storage Appliances with Oracle Storage Cloud, organizations benefit from the highest performing storage systems for their on-premises needs, while seamlessly extending them to Oracle Cloud resources when necessary. Oracle ZFS Cloud is the unifying enabler that helps customers bridge the gap between their current infrastructure and plans for broader public cloud adoption."

The company said the new update also includes a series of innovations to the Oracle ZFS Storage Appliance that extend Oracle Database dynamic automation capabilities. Oracle claims these updates can increase database administrator productivity by as much as ten times, as well as add all-flash pools to accelerate business applications.

The platform also includes new storage protocols to help automate storage tuning and cloud-scale data protection, with more than 62TB per hour of data backup.

Read the original:
Oracle Claims First Truly Converged Cloud/NAS Storage Platform - Top Tech News

Read More..


Elastifile Harnesses the Flash Storage and the Cloud for Hybrid Workloads – Enterprise Storage Forum

Santa Clara, Calif. technology startup Elastifile launched a data storage platform today that combines the high-performance characteristics of flash with the scalability and flexibility of the cloud.

The company's offering, dubbed the Elastifile Cloud File System, is "hardware agnostic [and] optimized for flash," Amir Aharoni, CEO and co-founder of Elastifile, told InfoStor. Moreover, the cloud-enabled storage platform is not aimed at the "secondary storage and backup" requirements that organizations commonly relegate to the cloud, but rather at their mission-critical workloads, he added.

The technology draws from the know-how of its executive leadership, which hails from companies that helped pave the way for the advanced networking, flash storage and virtualization technologies used in today's modern data centers.

Elastifile was founded in 2013 by Aharoni, formerly of Mobixell Networks and Optibase, and current CTO Shahar Frank, co-founder of enterprise flash storage pioneer XtremIO, which was acquired by EMC in 2012. Fellow co-founder Roni Luxenburg was an executive at virtualization specialist Qumranet, which was snapped up by Red Hat, and at Pentacom, a Cisco acquisition.

To date, the company has raised more than $50 million, including a $35 million Series B funding round in early 2016. Battery Ventures, Lightspeed Venture Partners, Cisco and Western Digital are among the company's backers. Early customers include Innova, Sigma Vista and the European Bioinformatics Institute.

Flash storage and cloud computing may have gained mainstream acceptance, but their potential to power elastic infrastructures, in which data can move around in an unencumbered and on-demand manner, remains unfulfilled, according to Andy Fenselau, vice president of marketing at Elastifile. The reality is that despite software-defined methods and other advances, data remains trapped in traditional storage arrays, newer hyperconverged appliances and other silos for many enterprises, he said.

Elastifile Cloud File System, using the company's Cross-Cloud Data Fabric technology, helps usher businesses into more of a self-service model, a "world where applications and their data can move dynamically across sites, across clouds," Fenselau said. "Users are in charge."

The product employs the Bizur consensus algorithm, a patented distributed metadata model and adaptive data placement techniques to provide cloud-enabled storage services capable of handling transactional workloads with latencies in the one- to two-millisecond range.

Crucially, Elastifile enables businesses to "lift and shift" or cloud-burst their applications without refactoring them, Aharoni said. Elastifile supports the "big three" cloud providers, namely Amazon Web Services (AWS), Google Cloud Platform and Microsoft Azure. A single, global namespace helps streamline management. Data deduplication and compression is included at no cost.

Elastifile Cloud File System is available now. Pricing is based on a consumption-based subscription model that spans both on-premises and cloud storage managed by Elastifile.

Pedro Hernandez is a contributing editor at InfoStor. Follow him on Twitter @ecoINSITE.

Visit link:
Elastifile Harnesses the Flash Storage and the Cloud for Hybrid Workloads - Enterprise Storage Forum

Read More..

The best NAS drives – PC Advisor

Network attached storage drives let you access files from any device on your network. Check out our NAS drive reviews and buyers' guide. Store files across your network with 10 of the best NAS drives.

By Benny Har-Even | 05 Apr 17

NAS drives are like cloud storage: you can access all your files from anywhere, both inside and outside of your home or office. You can use them to store and play your music and video collections, as well as documents and other files.

NAS stands for Network Attached Storage and, as the name suggests, it enables you to have a large amount of storage connected directly to your broadband router. This storage is therefore available to all your devices.

NAS drives are designed to be turned on permanently, which means you can have access to your music, movies, photos and documents at all times. Most have timers so you can set them to turn on and off during the hours you want.

One of the most popular reasons to buy an NAS drive is for media playback. Videos can be viewed on your TV, without having to connect a computer.

An NAS drive will use much less power than a regular PC, too, making it much cheaper to run. For ease of setup and ease of use, a dedicated NAS drive is hard to beat.

So what should you look for when choosing one to buy?

The first requirement is capacity. You'll need one that has enough storage to meet your needs now and in the future. Plenty of NAS drives come with no disks at all - these are known as diskless or bare drives. The advantage is that you can choose the drives you want and how much capacity you need.

You can now get disks up to 10TB in size, though you'll be paying at least £400 or so for the privilege. 4TB disks are arguably the current sweet spot, at around £120.

When you choose your disks, look for ones designed to work specifically with NAS boxes. NAS-optimised features include more robust construction with greater resistance to vibration, which makes a lot of sense for a drive that's designed to be on the whole time. They also offer power management so they can adjust performance based on their temperature.

These drives also offer special firmware features, known by WD as TLER (Time-Limited Error Recovery) and by Samsung and Hitachi as command completion time limit (CCTL). These optimise error correction for drives installed in a RAID array (explained below), as is usually the case with NAS drives.

RAID stands for redundant array of inexpensive disks. RAID can be quite complex, but at a basic level you'll want to use it primarily to provide redundancy, so that if a disk fails your data is still safe. There are many variants, but three of the most popular are RAID 1, RAID 5 and RAID 6.

Most NAS drives will offer at least two bays, which means you can set them up as RAID 1. In this scenario the second drive is a mirror of the first, so if one drive fails completely, all your data is safe on the other. You can then replace the faulty disk and rebuild the RAID array (this will take many hours).

RAID 5 requires at least three drives and adds parity data, which means a RAID 5 array can withstand a single drive failure without losing data or access to it. As data is striped across three drives, reads are fast, but at the expense of slower writes, since the parity data must also be written.

RAID 6, meanwhile, requires four drives but offers striping with dual parity, so two drives can fail and the array can still recover.
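The capacity and fault-tolerance trade-offs above boil down to a little arithmetic. This sketch uses a simplified model (identical disks, no hot spares; the function name is ours) with the 4TB disks mentioned earlier:

```python
def raid_summary(level, disks, disk_tb):
    """Return (usable TB, drive failures tolerated) for common RAID levels."""
    if level == 1:                      # mirroring: capacity of a single disk
        return disk_tb, disks - 1
    if level == 5:                      # single parity: lose one disk's capacity
        return (disks - 1) * disk_tb, 1
    if level == 6:                      # dual parity: lose two disks' capacity
        return (disks - 2) * disk_tb, 2
    raise ValueError(f"unsupported RAID level: {level}")

# Minimum-size arrays for each level, using 4TB disks.
for level, disks in [(1, 2), (5, 3), (6, 4)]:
    usable, tolerated = raid_summary(level, disks, 4)
    print(f"RAID {level}, {disks} x 4TB: {usable}TB usable, survives {tolerated} failure(s)")
```

Note how RAID 5 and RAID 6 both leave 8TB usable in their minimum configurations here; RAID 6 buys the extra failure tolerance at the cost of a fourth disk.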

Whichever you choose, however, don't consider RAID to be the only backup of your data. First, you're relying on the RAID array rebuilding successfully, and while from experience we know that it does work, it is another point of failure.

If the box just dies, or if something catastrophic happens such as a fire, you'll still lose all your data. To mitigate this you'll want another external backup, preferably to the cloud. Most NAS drives offer native applications for certain providers, but these will require a subscription to the service and won't necessarily support your preferred one.

Another feature to look out for is hot-swap capability, which enables you to remove or add a drive without powering down first. This could be important if you're running business applications off your NAS and want to maintain uptime when replacing or adding a drive.

You should also consider whether you'll need remote access to the drive. Previously this required signing up to a third-party DNS service, but these days with most NAS drives you can just sign up for an account with the manufacturer as you set up the drive. Log in to the account and it will handle the connectivity to your box at home. If privacy is a concern you may not wish to go down this route, but for ease of use it's the way to go.

It's also worth considering how powerful you need your NAS's processor to be. The dedicated operating systems that NAS drives run are lightweight, but a faster processor and more memory will enable features such as transcoding.

This means that any media files will be converted on the fly into a playable format, so you don't have to rely on your client device being able to play the files smoothly. For example, HEVC (H.265) files are becoming popular due to their small file sizes, but devices that can play them back natively (aside from the latest 4K TVs) are still uncommon.

Transcoding will deal with this for you if your NAS is powerful enough. However, if you have 4K files and want to play them on all your devices, you'll need a fast NAS.
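In essence, the media server on the NAS makes a per-stream decision like the following. This is a toy illustration only; the device table and function are hypothetical, not any vendor's actual API:

```python
# Hypothetical codec support per client device.
CLIENT_CODECS = {
    "4k_tv": {"h264", "hevc"},   # the latest 4K TVs decode HEVC natively
    "old_tablet": {"h264"},      # older devices typically do not
}

def needs_transcode(file_codec, client):
    """True when the NAS must convert the stream on the fly for this client."""
    return file_codec not in CLIENT_CODECS[client]

print(needs_transcode("hevc", "old_tablet"))  # True: transcode to H.264
print(needs_transcode("hevc", "4k_tv"))       # False: plays natively
```

Every stream where this returns True costs CPU time on the NAS, which is why real-time 4K HEVC transcoding demands the faster processors discussed above.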

Finally, you might want to consider what use you'll be putting your NAS to. Beyond media, a small business user will want to know what applications it offers, such as setting it up as an email server or a VPN server, or using it to host a website.

All in all, the 216+II NAS matched our expectations of Synology and will be a very good choice for home or small business use. If you aren't confident about installing hard disks, this is the box to get, as it's easy and doesn't require tools. There's a huge range of applications to choose from, the processor SoC offers plenty of horsepower to run them on, and it all runs quietly. With its fantastically easy installation, setup, app support and general ease of use, the Synology is a very solid choice. However, if you like the idea of a direct hook-up via HDMI, you may be swayed by the slightly pricier QNAP TS-251A.

Read our Synology 216+II NAS review.

We liked the Asustor AS1004T for its ease of installation, its relatively quiet operation in normal use and its decent performance. It isn't fast enough for hardware transcoding, though, so you'll need native support for all your files on all your client devices. Where it trumps the competition is that it offers a four-bay chassis where others at a similar price offer only two. If storage rather than performance is the priority, then it's a great choice, and while the ADM interface isn't as accomplished-looking as some of its rivals', it has the apps you'll likely need.

Read our Asustor AS1004T review.

The QNAP is an undoubtedly impressive NAS drive. There's plenty of power for virtually all tasks, and H.265 aside, it will handle anything you throw at it. The range of apps is very comprehensive and the interface is excellent. The downside is the lack of support for MKV in its native app, which will mean paying for Plex to play files on mobile devices. The unit was also noisier than we would have liked in operation, and while it's good value, it's not cheap. If you're willing to stretch to paying this much for a diskless system, the QNAP TS-251A is the best-featured NAS drive at the price.

Read our QNAP TS 251A review.

When it comes to ease of use the WD My Cloud Mirror is hard to beat. Initial setup is very easy, and even sorting out remote access is simple. For sharing music, movies, photos and documents it works a treat, and performance is fine. The downside is that you don't get the huge range of apps that are available for other brands. However, if you prioritise ease of setup and ease of use, the WD is worth looking at, and with 4TB of storage included for the price, it's a great-value option.

Read our WD My Cloud Mirror 4TB review.

Synology has made headlines with its new cut-price DS115j and its recommended retail price of just £78. The performance has also been cut, along with useful features like USB 3.0, but if you need these, the DS114 is still in the range for around £140. And if you really would rather not spend that, the cheaper DS115j will take on basic storage tasks, and still perform faster than some more expensive competition.

Read our Synology DS115j review.

The Netgear's physical design is very impressive, but we were troubled by issues that meant it lacked the appeal of drives we've tested from QNAP and Synology. Not all disks can be installed in a tool-less fashion, and the interface for installing and using apps isn't the best we've seen, nor is the range of choices. Performance is good, but the ARM processor doesn't quite have the chops to handle 4K transcoding. It's a good NAS, but it would need to be cheaper for us to recommend it over the competition.

Read our Netgear ReadyNAS RN212 review.

The 216play will likely be a disappointment to 214play owners wondering about an upgrade. It makes sense only if you have - or will soon have - lots of 4K content that you need to transcode on the fly. Its performance is good, but if you don't need real-time transcoding, you may want to opt for a different DiskStation (or indeed another NAS entirely) which has the extra ports and SD slot which the 216play lacks.

Read our Synology DiskStation DS216play review.

The Synology DS414j may not be the most glamorous of NAS drives, if indeed there is a candidate leader, but it is well-made and packs just enough power to not embarrass itself in basic benchmark tests for its file-serving speed. That it runs the same carefully wrought and versatile operating system as its dearer brethren is a definite plus, making it suitable for small-scale business use as well as being turned to home entertainment duties.

Read our Synology DS414j review.

Synology's RRP for the DS415play is £372, and at that price or the inevitably lower real shop prices the company should have a winner on its hands. The competing QNAP TS-469L is faster and has better specifications but is over £100 more expensive. When you combine the performance, price and the siren-like draw of DSM 5.0, this could be a crowd-pleaser for the multimedia NAS market.

Read our Synology DS415play NAS review.

The WD My Cloud EX2 has a few minor faults, but it's easy to use and provides good performance and reliability at an attractive price. There are more sophisticated NAS drives available for larger businesses, but the EX2 provides all the features that home users and small businesses are likely to need, and presents them in a straightforward manner that will appeal to people who might not have used a NAS drive before.

Read our WD My Cloud EX2 review.

View post:
The best NAS drives - PC Advisor

Read More..

Security Still Biggest Cloud Storage Concern For Business Execs – Silicon UK

Cloud security concerns and the early announcements from Teradata Universe 2017

Security is still the number one concern among senior business executives when it comes to storing data in the cloud, despite adoption continuing to accelerate across industries.

According to a survey carried out by big data and analytics firm Teradata, more than half of business-critical data is likely to reside in the cloud by 2019, but 80 percent of executives still cite security as the biggest deterrent.

However, these worries will not stop the acceleration of cloud adoption. Fifty-six percent of IT data is predicted to reside in the cloud by 2019, followed by customer (53 percent) and financial data (51 percent).

In terms of specific reasons holding back the storage of critical data in the cloud, 40 percent of respondents said general security is a risk, while 25 percent believed cloud data adoption will result in more security breaches.

Our message to organisations around the world is that the cloud is actually one of the most secure means of virtual storage available, said Marc Clark, director of cloud strategy and deployment at Teradata. While our study finds widespread concerns, the fact is that cloud storage is growing rapidly, remains hugely cost-effective, and that there are ways to manage it securely.

Cloud computing security processes should be designed to address the security controls that the cloud provider will incorporate, in order to maintain the data security, privacy and compliance with necessary regulations, as well as providing a business continuity and data backup plan.

When it comes to different industries, healthcare and telecoms are two sectors making big moves. 59 percent of respondents highlighted the healthcare industry as one that will significantly increase adoption over the next two years, and 48 percent of telecoms organisations anticipate a significant increase in their cloud storage use by 2019.

Along with the above survey, Teradata has made multiple product announcements at its Universe 2017 event currently taking place in Nice, France.

The first is a flexible database license model across hybrid cloud deployments, providing portability for deployment flexibility, subscription-based licenses and simplified tiers with bundled features. In a nutshell it means that customers will have the flexibility to choose, shift, expand, and restructure their hybrid cloud environment by moving licenses between deployment options.

John Dinning, Chief Business Officer at Teradata, said: Not only is the database license portable across the hybrid cloud options, but so are workloads, enabled by a common code base in all deployments. This flexibility is a first in our industry and means that data models, applications, and development efforts can be migrated or transferred unchanged across any ecosystem.

The second announcement is an updated customer journey platform that will give marketers easier access to customer path analytics, dynamic communication journey visualisations, machine learning and predictive simulations to provide customer insights.

Marketers will be able to use the data collected from both online and offline sources to increase response rates and reduce churn. Importantly, it will also enable greater personalisation, something which customers have come to expect in today's omni-channel world.

In this release of Customer Journey we are putting more analytics into the hands of marketing, so they can build a deeper understanding of customer experiences and then proactively optimise related journeys, said Dan Harrington, head of Teradata's Consulting & Support Services.

Finally, there was the launch of an all-memory update to Teradata's IntelliFlex platform, which moves completely to solid-state drives (SSDs) to deliver increases in performance, storage density and energy efficiency.

By transitioning to all SSDs we now provide an all-memory appliance capable of delivering up to seven times the compute power per cabinet of our previous product, plus rapid performance elasticity that is simply unmatched in our market, said Oliver Ratzesberger, EVP and chief product officer at Teradata.

We are providing our customers with more performance, more storage, and more memory in the same footprint, and at half the energy consumed per unit of performance delivered.


More here:
Security Still Biggest Cloud Storage Concern For Business Execs - Silicon UK


Cloud Computing Moves to the Edge – Data Center Knowledge

Ernest Sampera is Chief Marketing Officer for vXchnge.

In a time when we all expect instant access to our personal and professional networks, it's never been more important to have the right technology and strategies in place for supporting today's advanced users, applications, and data. From business applications like ERPs and Salesforce to the ability to post to Facebook with zero lag time, decreased latency is becoming a must-have as business users and consumers demand new levels of efficiency and speed.

To remain competitive and meet the growing demand for more responsive services, IT departments are leveraging edge computing. Edge computing enables companies to put the right data in the right place at the right time, supporting fast and secure access. The result is an improved client experience and, oftentimes, a valuable strategic advantage.

The transition to edge computing is being driven by three rapidly evolving, and often overlapping, dynamics: the growth of IoT, the pace of technology-empowered business, and evolving user expectations.

IoT usage is poised to explode, with over 50 billion things projected to be connected to the Internet by 2020. In fact, the IoT is the most commonly cited reason for a move to an edge computing architecture, as more than 80 percent of IT teams want their data centers to be more available and reliable to keep pace with IoT demands. Edge computing enables faster real-time analysis and lower costs for managing, analyzing and storing IoT data.
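The cost saving the article describes usually comes from aggregating sensor data at the edge so that only summaries travel upstream. A minimal Python sketch of that idea follows; the window size and the synthetic temperature readings are illustrative assumptions, not details from the article.

```python
from statistics import mean

def aggregate_window(readings, window=10):
    """Average every `window` raw readings into one summary value,
    shrinking the volume sent to the cloud by a factor of `window`."""
    return [mean(readings[i:i + window])
            for i in range(0, len(readings), window)]

# 100 raw samples from a (simulated) temperature sensor become
# 10 summary points before leaving the edge site.
raw = [20.0 + (i % 5) * 0.1 for i in range(100)]
summary = aggregate_window(raw)
print(len(raw), "->", len(summary))  # 100 -> 10
```

Real deployments layer retention policies and anomaly pass-through on top of this, but the bandwidth and storage arithmetic is the same: less data leaves the edge, so less must be moved, analyzed and stored centrally.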

Today, almost every company in every industry sector needs near-instant data to be successful. Restaurant chains need to know where their food product is coming from, when it expires, and when it will arrive on their doorstep. A mistake in the supply chain could have consequences that range from losing a loyal customer to a food safety crisis that results in food-borne illness. Retail stores need to know what customers bought yesterday, how much they spent, and what they are looking to buy next. In the financial sector, milliseconds can make a dramatic difference for high-frequency trading algorithms. And, in healthcare, real-time patient information can be the difference between life and death. These scenarios require speed and scale to support latency-sensitive, machine-to-machine data.

When it comes to consumers, expectations are high, and brands must be prepared. Edge computing allows businesses with a geographically dispersed customer base to deliver the exceptional availability consumers demand, while also enabling data to be shared across the globe instantly. It also enables businesses with remote or branch offices to replicate cloud services locally, improving performance and productivity.

According to a recent BI Intelligence report, the manufacturing, utility, energy and transportation industries are expected to adopt edge computing first, followed by smart cities, agriculture, healthcare and retail.

Seventy-nine percent of IT teams feel that having customers closer to their content is the most important benefit of a data center. Utilizing an edge data center in markets close to customers means companies can provide better service, with less physical distance and minimal latency.

When choosing an edge data center provider, organizations should look for providers committed to standards such as ISO 27001, HIPAA, or SSAE 16 Type II, depending on their particular industry. A data center that is certified can provide peace of mind to companies and their customers that their sensitive data, and ultimately their brand, is protected.

The decision to implement an edge computing architecture is typically driven by the need for location optimization, security, and most of all, speed.

The importance of speed to every business operation cannot be overstated. It's no longer a competitive advantage; it's a necessity. Today's data management systems require the most immediate information to support in-the-moment decisions that can be worth millions of dollars to the bottom line. By bringing processing to the edge of the network, businesses reduce latency by prioritizing processing and lightening the load on the primary network, supporting better, faster decision-making.

Location optimization reduces data processing from minutes and hours to milliseconds and microseconds and as close to real time as you can currently get. Less physical distance translates to minimal latency and greater reliability. Allowing customers in Nashville to receive the same speed and level of service as those in New York is one example of what edge computing can enable.
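The distance-to-latency relationship above can be put in rough numbers. This back-of-the-envelope Python sketch assumes signals travel about 200 km per millisecond in fibre (roughly two-thirds the speed of light) and uses illustrative distances; real round-trip times also include routing and processing overhead, so treat these as best-case floors.

```python
def round_trip_ms(distance_km, speed_km_per_ms=200.0):
    """Best-case round-trip propagation delay: the signal must
    travel to the server and back, at ~200 km per millisecond."""
    return 2 * distance_km / speed_km_per_ms

# A Nashville client reaching a nearby metro edge site versus a
# data center roughly 1,200 km away (e.g. on the East Coast).
print(round_trip_ms(50))    # ~0.5 ms floor to a local edge site
print(round_trip_ms(1200))  # ~12 ms floor to the distant site
```

The gap widens once queuing and protocol round trips multiply the base delay, which is why placing content physically closer to users pays off disproportionately for chatty applications.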

While cloud computing won't be slowing down anytime soon, edge computing is finding its place in IT architectures. Cloud computing and edge computing provide significant, yet different, benefits, and smart IT strategists will be sure to take full advantage of both.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

Read more here:
Cloud Computing Moves to the Edge - Data Center Knowledge


China Telecom wins bid for government cloud computing contract for only RMB 0.01 – TechNode (blog)

China Telecom recently won a bid to provide IT services to a government information center in Liaoyang, a third-tier city in Liaoning Province, for as little as RMB 0.01. This has sparked controversy for alleged price distortion and unfair competition, local media is reporting (in Chinese).

The Liaoyang city government has set aside RMB 8.93 million for the procurement of hardware needed in the cloud computing and big data processing platforms of its information center, among others, according to an online procurement announcement published by the Liaoyang government. In addition, the bid winner is requested to provide routine maintenance for the platforms for a period of 10 years.

China Telecom's bid underscores the rising competition in the country's cloud computing sector, which is no stranger to such practices. Last month, Chinese internet giant Tencent reportedly won a government cloud service contract with an RMB 0.01 bid, in an attempt to expand its foothold in the country's huge cloud market as well as wrest market share from Alibaba's cloud computing unit Aliyun, estimated by Morgan Stanley Research to have grabbed half of the country's US$2 billion public cloud market (in Chinese).

Telecom equipment maker ZTE's unit ZTE Soft Technology made a similar move last January by putting in an RMB 0.01 bid for a real-time communication system contract for the Ministry of Public Security, which also provoked the ire of its competitors.

Apart from its telecom peers, China Telecom is also facing ever-increasing pressure from internet firms, as their rapid expansion has taken a toll on its profits in recent years. The company's 2016 net profit plummeted 10.2 percent year-on-year to RMB 18 billion, according to a financial report it recently released (in Chinese).

Industry observer Xiang Ligang held that the company is using a lowballing strategy to pave the way for its future development, as there may be value-added or additional services extended from the current project in the future.

Cloud computing projects are usually constructed in stages, and there will be expansion projects once a phase one project is completed, Xiang added.

Sheila Yu is a Shanghai-based technology writer. She brings readers the biggest news from Chinese language tech media. Reach her at sheila@technode.com.

Go here to read the rest:
China Telecom wins bid for government cloud computing contract for only RMB 0.01 - TechNode (blog)
