Category Archives: Cloud Storage
Calm down, everyone it’s only a Monday storage news blitz – The Register
Storage news, like data growth, is unrelenting.
This week we learned of new things in the cloud, tape, new storage arrays, object storage, persistent storage for containers, cloud storage gateways, deduping to disk, and ultra-secure disk drives and SSDs. Ready? Here we go.
DataDirect Networks (DDN) has updated its WOS software to WOS Core 2.8 and WOS S3 2.3 (for the connectivity features), claiming it's competitive cost-wise with tape for archiving. We asked if comparative numbers were available.
A DDN spokesperson said WOS is competitive with tape storage alternatives and gets close to the cost of tape for active archives. DDN hasn't published figures, and isn't referring to the cost of a tape removed and on a shelf, but rather to tapes in a library with an automated method for loading them (a robot) and sufficient performance to be viewed as an adequate alternative.
That's a no, then.
DDN also told us more about its new Extended ObjectAssure erasure coding: Extended ObjectAssure is the application of a fairly common model for data protection for object storage and, when combined with WOS's existing selection of data protection options, offers the most expansive flexibility and choice in the industry.
It builds on WOS ObjectAssure technology to provide data protection at Local ObjectAssure efficiency while providing the highest availability of data. Extended ObjectAssure erasure coding divides the object into k+m fragments, and each fragment is stored on a separate node. Data remains available as long as at least k fragments are available; that is, up to m fragments can be lost. WOS supports a wide variety of Extended ObjectAssure configurations, where:
This allows customers to get to very efficient overhead numbers for parity, especially for wide and deep codes for large data repositories. Offering this option alongside our existing Local ObjectAssure (LOA), Global ObjectAssure (GOA) and replication now gives customers the ultimate ability to manage data protection based on the value of the data and the desired performance, latency, MTTDL and availability. All the data protection options are applied by policies at the object level, and can be intermixed within a cluster.
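To make the k+m arithmetic concrete, here is a small sketch (our own illustration, not DDN's implementation) of how fragment counts translate into storage overhead and fault tolerance:

```python
# Erasure-coding arithmetic for a k+m scheme: an object is split into
# k data fragments plus m parity fragments, one fragment per node.

def overhead(k: int, m: int) -> float:
    """Raw bytes stored per byte of user data: (k + m) / k."""
    return (k + m) / k

def survives(k: int, m: int, failed_fragments: int) -> bool:
    """Data stays readable as long as at least k of the k+m fragments remain."""
    return (k + m) - failed_fragments >= k
```

A wide, deep code such as 16+4 keeps overhead at 1.25x while tolerating the loss of any four nodes; three-way replication, by contrast, costs 3x to tolerate only two losses, which is why wide codes suit large repositories.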
There was an advance in S3 connectivity with the update as well, from single-site to multi-site connectivity. DDN said it did this because "we consistently hear about customers not being able to meet replication SLAs with S3."
Customers can now access the WOS repository via S3 in multiple locations, with centrally managed credentials (including integration with LDAP and Active Directory), as well as fully protect the repository in case of local outage or disaster, with the ability to failover and failback once a primary site is restored. The solution we are announcing is very high performance and will allow our customers to meet their stringent SLA for replication.
Exagrid has released V5.0 of its deduping, disk-based backup software. It includes better support for Oracle RMAN Channels, Veeam Scale-Out Backup Repository (SOBR), and replication to AWS public cloud for disaster recovery.
Oracle RMAN customers can use RMAN Channels with up to 25 appliances in an ExaGrid scale-out GRID system, and enjoy global deduplication, for what it's worth in that database environment. Sections of data are sent to each appliance in parallel for improved performance and also for load balancing, as RMAN Channels will send the next section of data to the next available appliance in the GRID.
ExaGrid CEO Bill Andrews exclaimed: "ExaGrid v5.0 is the first backup storage solution that provides fast backups at a rate of up to 200TB/hour per PB, and fast restores with the ExaGrid landing zone coupled with performance load balancing and failover when working with Oracle RMAN." He reckons: "There is no solution on the market that will come close to the ExaGrid approach for Oracle RMAN."
He tells The Reg: "80 per cent of our business now comes from replacing Commvault deduplication to disk, Veritas 5200/5300 appliances, Data Domain, HP StoreOnce and Quantum DXi," which is threatening to those suppliers if ExaGrid's business keeps on growing.
It is outpacing the dedupe-to-disk backup market, with leader Data Domain maturing yet still unable to offer global deduplication. ExaGrid reckons it could achieve an IPO in 2019, with Andrews saying: "We are heads-down focused on a 2019 IPO. We have met with many investment banking firms and know exactly what we need to do."
Fujitsu and Oracle are offering Oracle public cloud services from a Fujitsu data center in Japan. Customers can subscribe to Fujitsu Cloud Service K5 DB powered by Oracle Cloud. The Oracle Database Cloud Service is one of the database options available from Fujitsu Cloud Service K5.
Fujitsu and Oracle formed a strategic alliance on July 6, 2016 to deliver enterprise-class cloud services to customers in Japan and their subsidiaries around the world.
A canned quote from Edward Screven, Chief Corporate Architect at Oracle, said: "By combining Fujitsu's system integration expertise with Oracle's cloud services, Fujitsu and Oracle will accelerate the transition of our joint customers' enterprise systems to cloud." Fujitsu has the largest number of Oracle-certified Oracle Cloud engineers in Japan.
This should provide a timely boost to Oracle as it focuses on growing its public cloud customer base.
Mpstor has a new all-flash array, the OSA-F60, with these features:
We prodded Mpstor to clarify a few things about the array. Here are the results of that probe:
El Reg What is the array's raw capacity?
Mpstor We support a total of 60 x 15.3TB drives in a 2U chassis for a total raw capacity of approximately 900TB.
El Reg What CPUs do the controllers use? (including number of cores)
Mpstor We use a 10 core Intel Xeon CPU E5-2670.
El Reg 128GB DRAM per controller for the two controllers?
Mpstor We use 128GB per controller.
El Reg What is the array's latency?
Mpstor At 1M IOPS the latency is less than 2 ms.
El Reg Why not 32Gbit/s Fibre Channel?
Mpstor We support up to 16G FC, the next version will support 32G FC.
El Reg The release says field-replaceable disks, but it's an all-flash array?
Mpstor Every one of the 60 SSDs is a hot-swap FRU device and user accessible.
El Reg The release says it's available in "a range of SAN configurations including 8x8G-FC, 4x16G-FC, 4x48G-Ethernet or 4x56G-Infiniband", but it only supports 40GbitE, not 48.
Mpstor That was a typo; it's 40G Ethernet.
El Reg The release says "using an automation API or conventional GUI", but there is no such thing as an automation API, only an API used for automated access.
Mpstor Correct. The OSA-F60 is delivered with a set of Python CLI tools to automate the provisioning process, such as RAID creation, volume creation and volume exports to hosts, as well as all monitoring functions.
El Reg An Mpstor webpage says the OSA-F60 supports SAS expansion JBOD enclosures: "expansion enclosures can be either 2Ux60 2.5-inch SSD or 4Ux60 3.5-inch HDD drive enclosures." But it's described as an all-flash system. Why is there disk drive support there?
Mpstor The OSA-F60 is an all-flash SSD array, but the OSA-F60 software can manage HDDs as well. The option is there for a user to add additional capacity as either SSD or HDD.
El Reg What's the array cost?
Mpstor The price is less than US$2.00 per GB and less than $0.50 per GB effective (assuming a conservative dedup ratio of 4:1).
El Reg Can we buy one now?
Mpstor Units are available now.
Cloud storage gateway supplier Panzura says London-based global architecture practice Sheppard Robson has replaced a near end-of-life NetApp filer infrastructure with a hybrid cloud storage setup featuring Nimble hybrid flash arrays, Amazon AWS and Panzura Freedom NAS, for a simpler, faster and less expensive option than deploying primary, backup and archive storage at each of its sites. It shares a set of files from the cloud instead.
The customer had separate NetApp storage products in place in addition to an offsite disaster recovery co-location. The NetApp arrays handling unstructured data have been de-commissioned and the DR site closed down. VM clones have moved off the NetApp arrays to (HPE) Nimble arrays.
Sheppard Robson IT director Simon Johns said: "We've been able to shut down an entire data centre. And we've been able to provide collaboration between offices that we couldn't do previously."
UK-based StorageOS has announced the public beta release of its storage software, which delivers highly available databases in containerised apps. It's free to developers on the Docker Store and has recently achieved Docker Certified status for its managed volume plugin.
The StorageOS beta software has no hardware or kernel dependencies and can be installed in the cloud, in VMs, or on bare metal with one command.
Deni Connor, founding analyst for SSG-NOW, thinks using containers for production is becoming more prevalent and the market will be worth $2.7 billion by 2020. Persistent storage for containers should see a rise in demand if that happens.
Government Cloud Storage: Its Uses and Benefits – FedTech Magazine
Businesses all over the world are using cloud computing technologies for data storage. Indeed, IDC found in July 2016 that, in a survey of 6,000 IT organizations, nearly two-thirds were either already using or planning to use public cloud Infrastructure as a Service by the end of 2016. IDC expects public cloud IaaS revenues to more than triple, from $12.6 billion in 2015 to $43.6 billion in 2020.
But is the federal government turning to cloud storage? And if so, what is this cloud storage being used for?
As agencies deploy more sensors as part of the Internet of Things, they are collecting more data. All that data needs to be stored somewhere. Simultaneously, agencies are trying to meet requirements to consolidate and optimize their data centers. The cloud can help on both of those fronts by providing virtually unlimited storage, which allows agencies to shutter data centers as they move apps and data off of physical servers and into the cloud.
It costs between $4 and $100 to manage the storage of a single gigabyte of unstructured data over the course of its lifetime, according to a March 2016 study from Enterprise Strategy Group. The costs can vary depending on the type of storage, the number and salaries of administrators, the criticality of the data and whether it is subject to compliance mandates.
Distinguishing between different types of data and determining the data's value is critical. Technologies such as ControlPoint and Structured Data Manager can help agencies sort legacy data and determine its value.
As agencies better categorize existing data, they must also determine how storage-worthy data will be used, as FedTech reported. The frequency with which certain data is used and the required confidentiality of each data set can influence such decisions. That kind of analysis helps IT teams better determine where to store each data set, such as within a private, public or hybrid cloud solution.
"Ultimately, federal customers will use a hybrid cloud environment," Rob Stein, vice president for NetApp's U.S. public-sector division, told FedTech last year. "I talk to a lot of federal CIOs, and data storage is usually one of the top five things they want from the cloud."
The Defense Department, Department of Homeland Security and NASA are the agencies that are spurring spending on, and adoption of, Internet of Things sensors across the federal government, according to a report last year from Big Data and analytics firm Govini. In the years ahead, civilian agencies will likely deploy such sensors in greater numbers, but various partnerships will be needed to migrate defense-related technologies to those agencies.
Govini breaks down the IoT into two main categories: infrastructure, which involves equipment to enable the exchange of information between sensors, the cloud and devices; and software, which refers to applications that facilitate the transmission, storage and analytics of sensor-collected data. Cybersecurity is key for wireless devices, cloud storage and sensors, as well as software elements like data processing and device-based apps.
The federal IoT market is experiencing steady growth despite a slight dip in fiscal 2013 due to sequestration, as spending increased by 20 percent to $8.8 billion in fiscal year 2015, up from $7.4 billion in fiscal 2014, according to Govini.
A separate 2016 report from Govini found that the federal cloud services market is booming, and cloud storage is poised to see growth in particular as more agencies deploy IoT sensors that generate data that then needs to be stored.
Annual federal cloud spending increased by 24.8 percent to $3.3 billion in fiscal 2015, up from a five-year low of $2.6 billion in fiscal 2012, the report notes. IaaS, the largest market segment, is driving federal cloud spending. IaaS spending increased by 53.3 percent to $897.2 million in fiscal 2015 from $585.2 million in fiscal 2012, according to Govini.
Cloud storage is driving a big part of that growth, the report found. Cloud storage spans the major cloud computing models, IaaS, Email as a Service, Platform as a Service (PaaS) and Software as a Service (SaaS), and the services that support their deployments.
The shifting of traditional on-premises apps such as email and document sharing to the cloud is also driving the growth in cloud storage. The Department of Health and Human Services had been a reluctant cloud adopter, Federal News Radio reported. Now, however, it is embracing the cloud with gusto.
HHS migrated its email to the cloud and now wants to adopt Microsoft's Office 365 to offer more capabilities, including cloud storage for documents. Cloud storage gives agencies greater flexibility, enhances productivity and provides unlimited storage capabilities.
"The next thing we are trying to do is the OneDrive capability so you can have your documents anywhere. Then, obviously, coupling that with Office 365 in the cloud will make it virtual anyplace," Killoran told Federal News Radio. "Not only will we have unlimited storage, but we will be able to utilize those capabilities on multiple types of devices and to share information more collaboratively than in the past."
Shifting to the cloud can also help agencies shutter data centers and cut costs. The Data Center Optimization Initiative, which the Office of Management and Budget codified last summer, encouraged the adoption of virtualized servers and cloud services. DCOI will likely encourage the shift to software-defined data centers.
In an indication of how the cloud is reshaping data centers, the Army has well over 1,000 data centers but plans to shutter the vast majority of them by moving apps and data to the cloud.
In January, the Army finalized a long-awaited contract with IBM, worth $62 million, to build, manage and operate a cloud solution at the Army's Redstone Arsenal, near Huntsville, Ala.
Tim Kleppinger, vice president and senior client partner of IBM U.S. Federal, told AL.com that the initial group of apps that will be migrated to the cloud has been selected, and that some of the migration work will be done by the organizations that actually run the data and some will be done by IBM or other contractors.
Personnel, logistics, support and some finance apps will be migrated, according to Kleppinger. "Just a lot of legacy applications," he said. "That is the whole purpose of the cloud: reduce the amount of data, reduce the amount of apps, reduce the number of centers."
‘Cloud Storage And Connectivity Is Key To Photography’ – BW Businessworld
Canvera Digital Technologies, which creates photo albums, has collaborated with the world's two most famous album design software brands, Pixellu and Fundy. This is the first time an Indian printing company has had its specs loaded onto these international software packages for a seamless transition from album design to album print. Through this association, Canvera's plan is unifocused: to take pro photographers into the digital world of smart and efficient working, and to build an ecosystem for design services that includes internal growth and external collaborations. "We focus on the needs of our partners and devise strategies to help them enhance their experience. The association is one step closer to the digital design ecosystem and empowers our partners with new trending software, helping them with an array of design templates to choose from. In addition, they get a fast and seamless digital workflow, everything done at the click of a mouse," says Ranjit Yadav, Managing Director, Canvera Digital Technologies.
Edited Excerpts:
Tell us about the journey so far. The company was started by Dr Dhiraj Kacker and Peeyush Rai. We raised Rs 20 crore in our Series B funding round in 2016 from Info Edge, with participation from existing investors Footprint Ventures and other undisclosed participants. Canvera has built up a strong franchise with professional photographers in the most demanding sector now. We help them become more productive and effective with our best-in-class services. Canvera has set industry standards for fine-quality printing of photobooks, encompassing silver halide, inkjet and Indigo. Canvera's plan is unifocused: to take professional photographers into the digital world of smart and efficient working, and to build an ecosystem for design services that includes internal growth and external collaborations like Fundy and Pixellu. We are generating effective ways of working through our apps and online tools. Over time, we have built a strong consumer franchise through our classified offering to connect consumers and photographers.
How has consumer choice changed over the years? What do people look for in a design service solution? Consumers these days are very particular about their preferred choice of photographer. They are looking for the right one for their specific requirement: someone who is approachable, consultative and, most importantly, possesses a creative approach. Many times people can't decide on a photography style for the occasion. Every photographer has their own creative bent, so it is important to hire a photographer who blends creativity with consumer needs, resulting in beautiful shots. Since the sector is evolving multifold, there is no dearth of photographers to suit one's budget. The only thing required is browsing through the right channels while looking for a professional photographer.
Can you throw some light on the design and print industry today? The ability to handle photo and imaging requirements is an acquired skill. The growing trend is towards short-run printing and cloud-based solutions. Brands are moving to build an ecosystem that brings together photographers, printing experts and design services as a wholesome experience. Print on the go is needed. Canvera is a leading player in the industry, offering excellent design and printing facilities under one roof.
What, according to you, are the technological advancements beneficial for the sector? The proliferation of images via smartphones and lens technology is growing; the option of cloud storage and connectivity, along with the availability of design software, photo editing apps and the like, helps improve images dramatically. Printing them is the final proof; the rest is online, in the virtual world.
What is your outlook for this sector? The outlook is bright. It is an under-served market for both professional photographers and consumers. There is a need to work digitally, enabled by strong growth in data and broadband penetration. The market is growing 15-25 per cent annually due to increased requirements and spending on weddings and special occasions, and the consumer market is ripe for expansion.
What are the future plans? We have planned lots of new launches in product and service categories in 2017-18. We are building a digital ecosystem within the photography industry and, collectively with other industry experts, investing in the overall growth of the category in the Indian market.
A Beginner’s Guide on How Cloud Backup With Amazon S3 Works – Cloudwards
Amazon S3 may stand for Simple Storage Service, but figuring out how to get started with it can seem anything but simple. That's mostly due to the fact that S3 was designed to help developers build cloud-computing tools and is just one of about 70 different services included in the Amazon Web Services (AWS) platform.
During this guide, we'll give you a quick overview of how storage with Amazon S3 works and how to set it up. We'll also show you how you can build your cloud storage and backup strategy around S3 with minimal work using friendly integration software like that developed by CloudBerry Labs.
Among cloud storage services, few have as many data center regions as Amazon S3, with 14 around the world. That's because Amazon S3 is built on the same infrastructure used by the Amazon.com shopping platform.
The advantage of being able to choose a region near you is that it decreases latency, which translates to faster transfers to and from the cloud. A second key advantage of using S3 is that you're only charged for what you use. That helps make S3 much more scalable than many other cloud storage options, like Backblaze B2.
Before you get started setting up S3, you'll want to understand the fees, of course. The nice thing is that you're only charged for what you use, which helps control costs. At the same time, not understanding how charges are accrued can lead to some unpleasant surprises.
Amazon charges both a flat per-gigabyte storage rate and usage rates for various transactions. The rates vary by region, but not by much, except in South America, where they're around double.
Here's a look at the U.S. East (Virginia) rates:
Standard and standard infrequent access are two different storage classes offered by S3. Storage classes let you control costs even further by reducing base storage rates for data you don't need to access often.
There's a third S3 storage class called reduced redundancy storage, which decreases the number of copies of your data stored. Plus, there's a separate cloud storage service called Amazon Glacier that's designed for archiving and disaster recovery.
For usage charges, any uploads to Amazon S3 are free. Retrievals are charged per gigabyte per month:
Amazon also charges usage for other transactions, although if you're just using S3 for sync or backup, you'll never need to worry about them.
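As a rough sketch of how such a bill accrues (the rates below are illustrative placeholders, not current AWS prices), the model boils down to free uploads plus per-gigabyte storage and retrieval charges:

```python
# Simplified S3-style billing model. Rates are assumed examples only;
# check the AWS pricing page for real figures.
STORAGE_PER_GB = 0.023    # standard storage, $/GB per month (assumed)
RETRIEVAL_PER_GB = 0.09   # data transfer out, $/GB (assumed)

def monthly_bill(stored_gb: float, retrieved_gb: float) -> float:
    """Uploads are free; storage and retrieval each accrue per gigabyte."""
    return stored_gb * STORAGE_PER_GB + retrieved_gb * RETRIEVAL_PER_GB
```

Storing 100GB and pulling back 10GB in a month would run about $3.20 at these assumed rates, which shows why light-retrieval workloads like backup are the cheap case.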
With that out of the way, let's set you up with Amazon S3.
To get started with S3, head to the AWS console page and click the sign in to the console button.
You'll be redirected to the login page, where you can either sign in or create an AWS account. If you already have an Amazon.com account, you can just log in with those credentials.
Now that you're logged into AWS, you need to find S3 from among the 70-plus services available. You can browse the listings under all services to find it under the storage heading, or just type S3 into the search bar at the top of the page.
You'll be sent to the S3 management console, where you can set up your cloud storage by creating a storage bucket.
With AWS, by default, any account can create up to 100 cloud storage buckets. If you need more, you can submit a service limit increase request.
Click the create bucket button to get started.
You'll be asked to enter a name for your bucket and select a region.
Any name you enter must be unique, meaning accounts held by others can't have the same name. Also, Amazon enforces DNS-compliant naming conventions in all regions except for U.S. East.
Here are the DNS-compliant naming rules according to Amazon:
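In brief, per Amazon's published documentation: names must be 3 to 63 characters long, use only lowercase letters, numbers, hyphens and periods, start and end with a letter or number, and must not look like an IP address. A quick sketch of a checker (our own simplification, not AWS code):

```python
import re

# Simplified DNS-compliant S3 bucket name check; see the AWS docs for
# the full rule set (e.g. label-by-label constraints around dots).
BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_dns_compliant(name: str) -> bool:
    """Return True if `name` looks like a valid DNS-compliant bucket name."""
    if not BUCKET_RE.match(name):
        return False
    if ".." in name:  # no consecutive periods
        return False
    # Names formatted like an IP address are rejected
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):
        return False
    return True
```

A name like my-bucket passes, while My_Bucket (uppercase, underscore) or 192.168.0.1 would be rejected before the console even tries to create the bucket.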
The region you select will determine which data center your bucket is kept in. Choose a region close to you for faster speeds.
Once you're done, click the next button to set up your bucket properties.
There are three key properties you can set now or later:
Next up, you can set permissions to allow others to read or write to your S3 bucket.
Permissions can be set for both objects and object permissions. Objects are files or folders. Object permissions enable or disable read and write access to the access control list (ACL), which lets others manage permissions, too.
The final step is simply a review of everything you've set.
If everything looks good, click create bucket.
Now that your bucket is created, you can start storing data in it. Back on your S3 homepage, clicking on your new bucket will let you do that.
You'll be redirected to an interface for managing your bucket, with four tabs at the top: objects, properties, permissions and management. For the moment, we'll stay on the objects tab.
Click on the get started button at the bottom of the tab. A pop-up window will open that will walk you through the upload steps.
The first step requires that you click the add files button. You can then browse your file system to find and upload files.
You can only upload one file at a time and can't upload folders, so the process is going to be slow going if you manage your S3 cloud storage this way. We'll address this issue shortly, so stick with us.
The next step lets you set permissions and is identical to the corresponding step in the bucket creation process. Step three is to set properties.
The storage class property lets you change your bucket from standard to standard-infrequent access or reduced redundancy.
Encryption lets you choose whether or not Amazon encrypts your data at rest in the cloud. There are two options: an S3 master key or an Amazon KMS master key. KMS is Amazon's key management service. For a cost, it lets you create and manage your own encryption keys for added security and compliance reasons.
Click on next to review your settings before uploading your content by clicking the upload button. The process should be pretty quick. We measured speeds between 1MB/s and 1.5MB/s, which is better than most cloud storage services.
Back on the main page, you can also create folders under the objects tab to organize your cloud storage space.
Once youve got data stored in S3, you can adjust the properties and permissions you set at any time by visiting the tabs for either category.
The properties tab includes a few additional elements you'll want to be aware of. These include static website hosting, event monitoring and data copying across different Amazon S3 regions.
A fourth tab called management includes options for life-cycle management. Life-cycle management lets you automatically transfer data from one storage class to another. For example, transferring files from standard storage to Amazon Glacier, which is used for archiving.
You can also set data to delete automatically. Both transfer and delete transactions can be configured to happen based on a set of rules you define.
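Under the hood, such rules are expressed as a lifecycle configuration document. The sketch below (the prefix and day counts are made-up examples) shows the general shape the S3 API accepts, here as a Python dict:

```python
# Illustrative S3 lifecycle configuration: transition objects under
# "logs/" to Glacier after 30 days, then delete them after a year.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-then-expire",
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }
    ]
}
```

The console builds an equivalent document from the choices you make in the management tab; seeing the shape helps when you later script the same rules via the SDK or CLI.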
The management tab also has tools for analytics, metrics and inventory.
While setting up Amazon S3 buckets turns out to be pretty easy after all, having to back up one file at a time isn't going to work for most people. Unlike typical cloud storage and cloud backup services, there's no way from within the S3 console to automatically sync or back up multiple files. That's where good third-party software comes in handy.
You can find software for both home and office use, of varying degrees of cost, performance and features. We'll use perhaps the most recognized developer, and a personal favorite, as an example: CloudBerry Labs.
CloudBerry Labs produces three different types of software that integrate with Amazon S3:
We won't go into depth on the capabilities of these tools here, but let's take a quick look at how easily they integrate with S3, using CloudBerry Box as an example. If you'd like more information, check out our CloudBerry Backup review.
Once you install and start CloudBerry Box, it sets up a sync folder on your desktop called CloudBerryBox. You can connect that sync folder to S3 (or another storage service) via the desktop app.
With Amazon S3, you need to generate keys to activate this connection. Back in the AWS portal, click on your account name on the top-right side and select my security credentials from the drop-down menu.
On the next page, click on the line for access keys.
Click on the create new access key button. This will generate both the access key ID and secret access key you need to connect CloudBerry Box to Amazon S3. Once you close this screen, you won't be able to retrieve your secret key again, though you can download a key file for safekeeping.
Input the keys in the respective fields in the CloudBerry Box application. Then select the cloud storage bucket you want to connect to in the bucket name field.
To check the connection, hit the test connection button.
That's all there is to it. Going forward, anything put in the CloudBerryBox sync folder should upload to the cloud. The same basic approach will let you connect CloudBerry Explorer, CloudBerry Backup and similar services to S3.
Amazon S3 probably isn't going to be the solution most home consumers will want to go with. At $0.023 per gigabyte per month for storage, that equates to $23 per terabyte. Meanwhile, cloud storage services like Sync.com give you 2TB of storage for $8 per month, and cloud backup services like CrashPlan give you unlimited backup for $5.99 per month.
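A quick back-of-the-envelope comparison using the prices quoted above (both are list prices at the time of writing and may change):

```python
# Where does pay-per-use S3 pricing overtake a flat-rate consumer plan?
S3_PER_GB_MONTH = 0.023   # $/GB per month, as quoted above
FLAT_PRICE = 8.00         # flat consumer plan, $/month for 2TB

def s3_monthly(tb_stored: float) -> float:
    """S3 monthly storage cost for a given number of terabytes."""
    return tb_stored * 1000 * S3_PER_GB_MONTH

def break_even_tb() -> float:
    """Stored volume above which S3 costs more than the flat plan."""
    return FLAT_PRICE / (1000 * S3_PER_GB_MONTH)
```

At roughly 0.35TB stored, the S3 bill already matches the flat plan's $8, which is why per-gigabyte pricing rarely wins for home users sitting on terabytes of data.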
However, S3 has its uses, particularly for entrepreneurs, developers and businesses looking for more flexibility, features and scalability. While the service isn't quite as easy to set up as some consumer options, the added perks you get by integrating it with powerful third-party software like CloudBerry make it worth the effort.
We'd love to hear about your own experiences setting up Amazon S3 and what third-party tools you use to streamline your storage process. Feel free to let us know in the comments below. Thank you for reading.
Microsoft Invades the Cloud With Israeli Mega-Storage Service, Cloudyn – LearnBonds
Cloudyn is a start-up that is making headway in the realm of cloud services. Operating primarily from Tel Aviv, Cloudyn makes its mark in the remote-services segment by analyzing and optimizing the cloud storage and cloud services used by ... A separate report suggests Microsoft may buy the cloud monitoring startup for $50-70M.
Making the Leap to Space-Based Cloud Storage | @CloudExpo #Cloud #Storage #Telecom – SYS-CON Media (press release)
The Need for Speed - Making the Leap to Space-Based Cloud Storage
Enterprises and governments rely upon the timely transfer of mission-critical information to keep their projects and operations flowing smoothly. To do so requires a complex chain of communications hubs working efficiently, and even when they are working optimally, there can still be seconds of lag time between communications. That may not seem like much, but being able to get information a second faster than the competition (or the enemy) can make a huge difference.
Current technology, which can require multiple hops and interchanges through terrestrial networks, slows information down while at the same time exposing it to monitoring and manipulation along the way. Even the most efficient cloud network requires third-party data centers to replicate globally to provision worldwide offices effectively.
What if data could be securely transmitted from a single corporate network hub to any location worldwide in less than a second? The solution to this problem will be found through new space-based data center technology, creating a telecom backbone around the globe. Such a network will allow data to flow freely around the world without restriction and without fear of interception, enabling CIOs to virtually provision any remote office in less than one third of a second, regardless of proximity, without any latency, jurisdictional or cybersecurity issues.
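The "less than one third of a second" figure is at least plausible on physics alone. As a hedged sanity check (standard published orbital altitudes, straight-up propagation, ignoring processing and queuing delays, and not tied to any specific vendor's system), a quick speed-of-light calculation:

```python
# Back-of-envelope latency check for the "under one third of a second" claim.
# The altitudes are the standard GEO and a typical LEO value, not figures
# from the article; real links add processing and routing overhead.

C_KM_PER_S = 299_792.458      # speed of light in vacuum, km/s
GEO_ALTITUDE_KM = 35_786      # geostationary orbit altitude
LEO_ALTITUDE_KM = 550         # a typical low-Earth-orbit altitude

def one_way_hop_ms(altitude_km: float) -> float:
    """Ground -> satellite -> ground bounce, straight up and down (best case)."""
    return 2 * altitude_km / C_KM_PER_S * 1000

print(f"GEO bounce: ~{one_way_hop_ms(GEO_ALTITUDE_KM):.0f} ms")
print(f"LEO bounce: ~{one_way_hop_ms(LEO_ALTITUDE_KM):.1f} ms")
```

A single geostationary bounce alone is roughly 240 ms, which is where the one-third-of-a-second ceiling comes from; lower orbits cut the propagation delay to a few milliseconds at the cost of needing many more satellites for coverage.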
Applications at the Speed of Space

New technologies have been designed that can provide an independent space-based network infrastructure for cloud service providers and their enterprise and government customers to experience secure storage and provisioning of sensitive data around the world. By placing data on satellites that are accessible from everywhere via ultra-secure dedicated terminals, many of today's data transport challenges will be solved. Space-based storage offers a convenient solution to the issues of both security and jurisdiction while offering unprecedented transit speed.
Organizations and government entities already enjoy the communications benefits of the satellites ringing the earth. Using the technologies that would enable space-based cloud storage, they can enjoy even faster and more secure communications and offer services that would not otherwise be possible.
Space-based network infrastructure will expedite point-to-point delivery of drone audio and video. At present, there is a latency of more than two seconds in the delivery of real-time drone video. It is like driving a car whose view of the road is delayed by two seconds: maneuverability and agility are constrained. Using a sky-based telecom system, latency will be reduced to less than one second.
The space-based network would expedite 4K HDTV between two live audiences. Currently, studios employ parlor tricks to mimic live two-audience interaction, and video error correction must be applied at each server stop in both directions to meet the studio's demanding 4K HDTV specifications. Using a space-based system, latency would be reduced to about one second and require just one video error correction at the end. Studios would be enabled for true live audience interaction - a major market differentiator.
Expedited live video delivery would also become a reality, and video streaming services would be able to bypass congested, expensive networks.
Cloud service providers will be able to sell services without adding more capital or operational expenditures for competitive expansion, including:
The Final Storage Frontier

As cloud migration enjoys widespread adoption, many organizations find that their shared storage space creates congestion and downtime. Reduced speed creates logistical hurdles for live video and other data types. However, by removing the multi-hop system currently in place, space-based storage and transmission offers the speed that public and private organizations need for their mission-critical data.
@DevOpsSummit at Cloud Expo, taking place June 6-8, 2017, at the Javits Center, New York City, is co-located with the 20th International Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world.
DevOps at Cloud Expo / @ThingsExpo 2017 New York(June 6-8, 2017, Javits Center, Manhattan)
DevOps at Cloud Expo / @ThingsExpo 2017 Silicon Valley (October 31 - November 2, 2017, Santa Clara Convention Center, CA)
Download Show Prospectus Here
The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is no time to wait for long development cycles that produce software that is obsolete at launch. DevOps may be disruptive, but it is essential.
@DevOpsSummit will expand the DevOps community, enable a wide sharing of knowledge, and educate delegates and technology providers alike. Recent research has shown that DevOps dramatically reduces development time, the time enterprise IT professionals spend putting out fires, and support time generally. Time spent on infrastructure development is significantly increased, and DevOps practitioners report more software releases and higher quality. Sponsors of @DevOpsSummit will benefit from unmatched branding, profile building and lead generation opportunities through:
For more information on sponsorship, exhibit, and keynote opportunities, contact Carmen Gonzalez by email at events (at) sys-con.com, or by phone at 201 802-3021.
The World's Largest "Cloud Digital Transformation" Event
@CloudExpo / @ThingsExpo 2017 New York(June 6-8, 2017, Javits Center, Manhattan)
@CloudExpo / @ThingsExpo 2017 Silicon Valley (Oct. 31 - Nov. 2, 2017, Santa Clara Convention Center, CA)
Full Conference Registration Gold Pass and Exhibit Hall Here
Register For @CloudExpo Here via EventBrite
Register For @ThingsExpo Here via EventBrite
Register For @DevOpsSummit Here via EventBrite
Sponsorship Opportunities
Sponsors of Cloud Expo / @ThingsExpo will benefit from unmatched branding, profile building and lead generation opportunities through:
For more information on sponsorship, exhibit, and keynote opportunities, contact Carmen Gonzalez (@GonzalezCarmen) today by email at events (at) sys-con.com, or by phone at 201 802-3021.
Secrets of Sponsors and Exhibitors Here
Secrets of Cloud Expo Speakers Here
All major researchers estimate there will be tens of billions of devices - computers, smartphones, tablets, and sensors - connected to the Internet by 2020. This number will continue to grow at a rapid pace for the next several decades.
With major technology companies and startups seriously embracing Cloud strategies, now is the perfect time to attend @CloudExpo | @ThingsExpo, June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA. Learn what is going on, contribute to the discussions, and ensure that your enterprise is on the right path to Digital Transformation.
Track 1. FinTech
Track 2. Enterprise Cloud | Digital Transformation
Track 3. DevOps, Containers & Microservices
Track 4. Big Data | Analytics
Track 5. Industrial IoT
Track 6. IoT Dev & Deploy | Mobility
Track 7. APIs | Cloud Security
Track 8. AI | ML | DL | Cognitive Computing
Delegates to Cloud Expo / @ThingsExpo will be able to attend 8 simultaneous, information-packed education tracks.
There are over 120 breakout sessions in all, with Keynotes, General Sessions, and Power Panels adding to three days of incredibly rich presentations and content.
Join Cloud Expo / @ThingsExpo conference chair Roger Strukhoff (@IoT2040), June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA for three days of intense Enterprise Cloud and 'Digital Transformation' discussion and focus, including Big Data's indispensable role in IoT, Smart Grids and (IIoT) Industrial Internet of Things, Wearables and Consumer IoT, as well as (new) Digital Transformation in Vertical Markets.
Financial Technology - or FinTech - Is Now Part of the @CloudExpo Program!
Accordingly, attendees at the upcoming 20th Cloud Expo / @ThingsExpo June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA will find fresh new content in a new track called FinTech, which will incorporate machine learning, artificial intelligence, deep learning, and blockchain into one track.
Financial enterprises in New York City, London, Singapore, and other world financial capitals are embracing a new generation of smart, automated FinTech that eliminates many cumbersome, slow, and expensive intermediate processes from their businesses.
FinTech brings efficiency as well as the ability to deliver new services and a much improved customer experience throughout the global financial services industry. FinTech is a natural fit with cloud computing, as new services are quickly developed, deployed, and scaled on public, private, and hybrid clouds.
More than US$20 billion in venture capital is being invested in FinTech this year. @CloudExpo is pleased to bring you the latest FinTech developments as an integral part of our program, starting at the 20th International Cloud Expo June 6-8, 2017 in New York City and October 31 - November 2, 2017 in Silicon Valley.
@CloudExpo is accepting submissions for this new track, so please visit www.CloudComputingExpo.com for the latest information.
Speaking Opportunities
The upcoming 20th International @CloudExpo | @ThingsExpo, June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA announces that its Call For Papers for speaking opportunities is open.
Submit your speaking proposal today! Here
Our Top 100 Sponsors and the Leading "Digital Transformation" Companies
(ISC)2, 24Notion (Bronze Sponsor), 910Telecom, Accelertite (Gold Sponsor), Addteq, Adobe (Bronze Sponsor), Aeroybyte, Alert Logic, Anexia, AppNeta, Avere Systems, BMC Software (Silver Sponsor), Bsquare Corporation (Silver Sponsor), BZ Media (Media Sponsor), Catchpoint Systems (Silver Sponsor), CDS Global Cloud, Cemware, Chetu Inc., China Unicom, Cloud Raxak, CloudBerry (Media Sponsor), Cloudbric, Coalfire Systems, CollabNet, Inc. (Silver Sponsor), Column Technologies, Commvault (Bronze Sponsor), Connect2.me, ContentMX (Bronze Sponsor), CrowdReviews (Media Sponsor) CyberTrend (Media Sponsor), DataCenterDynamics (Media Sponsor), Delaplex, DICE (Bronze Sponsor), EastBanc Technologies, eCube Systems, Embotics, Enzu Inc., Ericsson (Gold Sponsor), FalconStor, Formation Data Systems, Fusion, Hanu Software, HGST, Inc. (Bronze Sponsor), Hitrons Solutions, IBM BlueBox, IBM Bluemix, IBM Cloud (Platinum Sponsor), IBM Cloud Data Services/Cloudant (Platinum Sponsor), IBM DevOps (Platinum Sponsor), iDevices, Industrial Internet of Things Consortium (Association Sponsor), Impinger Technologies, Interface Masters, Intel (Keynote Sponsor), Interoute (Bronze Sponsor), IQP Corporation, Isomorphic Software, Japan IoT Consortium, Kintone Corporation (Bronze Sponsor), LeaseWeb USA, LinearHub, MangoApps, MathFreeOn, Men & Mice, MobiDev, New Relic, Inc. 
(Bronze Sponsor), New York Times, Niagara Networks, Numerex, NVIDIA Corporation (AI Session Sponsor), Object Management Group (Association Sponsor), On The Avenue Marketing, Oracle MySQL, Peak10, Inc., Penta Security, Plasma Corporation, Pulzze Systems, Pythian (Bronze Sponsor), Cosmos, RackN, ReadyTalk (Silver Sponsor), Roma Software, Roundee.io, Secure Channels Inc., SD Times (Media Sponsor), SoftLayer (Platinum Sponsor), SoftNet Solutions, Solinea Inc., SpeedyCloud, SSLGURU LLC, StarNet, Stratoscale, Streamliner, SuperAdmins, TechTarget (Media Sponsor), TelecomReseller (Media Sponsor), Tintri (Welcome Reception Sponsor), TMCnet (Media Sponsor), Transparent Cloud Computing Consortium, Veeam, Venafi, Violin Memory, VAI Software, Zerto
About SYS-CON Media & Events

SYS-CON Media (www.sys-con.com) has since 1994 been connecting technology companies and customers through a comprehensive content stream - featuring over forty focused subject areas, from Cloud Computing to Web Security - interwoven with market-leading full-scale conferences produced by SYS-CON Events. The company's internationally recognized brands include among others Cloud Expo (@CloudExpo), Big Data Expo (@BigDataExpo), DevOps Summit (@DevOpsSummit), @ThingsExpo (@ThingsExpo), Containers Expo (@ContainersExpo) and Microservices Expo (@MicroservicesE).
Cloud Expo, Big Data Expo and @ThingsExpo are registered trademarks of Cloud Expo, Inc., a SYS-CON Events company.
See the rest here:
Making the Leap to Space-Based Cloud Storage | @CloudExpo #Cloud #Storage #Telecom - SYS-CON Media (press release)
Global Business Cloud Storage Market Report in terms of its Vendors, Types, Regional Distribution and Applications – PRWire (press release)
Big Market Research has recently added a new report, titled, Global Business Cloud Storage Market Research Report 2017. It provides insights on the historic period and forecast period, 2012-2017 and 2017-2022 respectively. The study offers a comprehensive analysis of the current industrial trends, drivers, opportunities, and key market players. The research provides extensive information about the industry to key vendors and stakeholders and assists them to take necessary steps to achieve growth in future.
The global business cloud storage market report offers an overview and scope of the product. The data provided in the study is represented through tables and figures. There is a pictorial and tabular illustration of the global production market share based on type in 2015. It shows revenue generated and growth rate of each region during the period, 2012-2022. The research represents the global storage capacity of each manufacturer in a table for the period, 2015-2016. In addition, it describes the average price of the industry by key market players during the period, 2015-2016. In the report, one figure illustrates the industrial share held by the top three vendors and another shows the market share of the top five vendors. It mentions the storage capacity by region in tabular format for the historic period.
Segmentation of the market
The report segments the global business cloud storage market by type, application, and geographical distribution. Based on type, the industry is divided into more than 5TB, 1TB to 5TB, 100GB to 1TB, and less than 100GB. The market finds its applications in backup storage solutions, data access & movement solutions, primary storage solutions, and cloud storage solutions. Based on regions, the study classifies the market into:
India, Southeast Asia, Japan, China, Europe, and North America.
Furthermore, the research depicts this categorization of the industry in tabular and pictorial format. The production and revenue generated by each region during the historic period are also mentioned. The study explores consumption, import, and export by each region during the aforementioned period. The experts also analyzed the industry based on application for the historic period.
Ask For Discount @ https://goo.gl/DTS5CN
Read the original post:
Global Business Cloud Storage Market Report in terms of its Vendors, Types, Regional Distribution and Applications - PRWire (press release)
Backblaze drops download pricing for its B2 storage platform by 60% – TechCrunch
It's been just under a year since Backblaze's B2 cloud storage service came out of beta. The platform's main selling point at the time was definitely its pricing, which undercut virtually all of its larger competitors (think AWS, Google and Microsoft). Today, it's launching a new round of price cuts, this time focused on download cost. Instead of $0.05 per gigabyte, Backblaze will now charge only $0.02.
Typically, when we talk about the price of cloud storage, we talk about how much it costs to store a gigabyte of data on those various platforms. Often, though, the real cost for many companies is actually getting this data out of those services, either to move their backups or to serve their customers.
On Google's cloud platform, this kind of network egress costs $0.12 per gigabyte for the first terabyte of downloads (with prices dropping after that). As far as network egress goes, Google tends to be pretty pricey. Microsoft and Amazon tend to charge less, but even their pricing starts at $0.09 per gigabyte.
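To put those per-gigabyte figures in context, here is a small illustrative calculation using only the first-tier rates quoted above. The provider labels and the 1 TiB example volume are chosen for illustration; real bills depend on volume tiers, regions, and whatever the current price lists say.

```python
# Illustrative egress-cost comparison from the per-GB rates quoted in the article.
RATES_USD_PER_GB = {
    "Backblaze B2 (new)": 0.02,
    "Backblaze B2 (old)": 0.05,
    "AWS / Azure (entry tier)": 0.09,
    "Google Cloud (first TB)": 0.12,
}

def download_cost(gb: float, rate_per_gb: float) -> float:
    """Cost in USD to download `gb` gigabytes at a flat per-GB rate."""
    return gb * rate_per_gb

for provider, rate in RATES_USD_PER_GB.items():
    print(f"{provider}: ${download_cost(1024, rate):,.2f} to pull 1 TiB")
```

At these rates, pulling a terabyte back out of B2 costs roughly a sixth of what the same download would cost on the priciest first-tier rate, which is the gap the article's 60% cut widens further.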
"For storage, we spent a decade building our custom Storage Pods and Vault cloud storage file system, and growing a culture that focuses on squeezing costs out at every layer," Backblaze CEO Gleb Budman told me when I asked him how his company was able to afford this price drop. "For bandwidth, though, it turned out that it's just not as expensive as other companies are pricing it at. As soon as we realized that, we lowered the pricing."
Budman also told me that over 50,000 people in 20,000 organizations now actively use B2.
Read more here:
Backblaze drops download pricing for its B2 storage platform by 60% - TechCrunch
Dataguise Strengthens Sensitive Data Governance in the Cloud with Support for Google Cloud Storage – insideBIGDATA
Dataguise, a leader in sensitive data governance, announced that DgSecure Detect now supports sensitive data detection on Google Cloud Storage (GCS). Integration with GCS extends the range of platforms supported by DgSecure Detect, which helps data-driven enterprises move to the cloud with confidence by providing precise sensitive data detection across the enterprise, both on premises and in the cloud. With DgSecure Detect, organizations can leverage Google's powerful, simple, and cost-effective object storage service with a complete understanding of where sensitive data is located, an important first step to ensuring data protection and privacy compliance.
DgSecure Detect discovers, counts, and reports on sensitive data assets in real time within the unified object-based storage of GCS. The highly scalable, resilient, and customizable solution precisely identifies and summarizes the location of this data, down to the element level. DgSecure allows organizations to comb through structured, semi-structured, or unstructured content to find any data deemed sensitive by the organization. The range of sensitive data that is discoverable by DgSecure Detect is nearly unlimited using the solution's custom sensitive data type definition capabilities.
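The core idea of pattern-driven, element-level detection can be sketched in a few lines. To be clear, this is not DgSecure's actual implementation; the pattern names, object name, and sample content below are invented for illustration, in the spirit of the custom data-type definitions the article describes.

```python
# Hypothetical sketch of element-level sensitive-data detection over the
# contents of one storage object. Patterns and sample data are illustrative.
import re

SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_object(name: str, text: str) -> dict:
    """Count matches per sensitive-data type in one storage object's text."""
    return {label: len(p.findall(text)) for label, p in SENSITIVE_PATTERNS.items()}

sample = "Contact jane@example.com; SSN 123-45-6789 on file."
print(scan_object("bucket/customers.txt", sample))
```

A production scanner would add custom user-defined types, structured-format parsers, and streaming over objects rather than loading them whole, but the per-type match counts are the same kind of element-level summary the article describes.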
Sensitive Data Detection Capabilities for Google Cloud Storage:
These new capabilities enable enterprises from a range of industries, including finance, insurance, healthcare, government, technology and retail, to gain accurate insight on where sensitive data resides in GCS so it can be protected properly. DgSecure helps organizations comply with regulatory mandates for PII, PHI, and PCI data, such as the European Union's General Data Protection Regulation (GDPR), the Health Insurance Portability and Accountability Act (HIPAA), and other data privacy and data residency laws.
"With support for GCS, Dataguise provides broad cross-platform support of sensitive data detection within the industry's most popular data repositories and platforms, both on premises and in the cloud," said JT Sison, VP, Marketing and Business Development, Dataguise. "Demonstration of DgSecure Detect at Google Cloud Next will be the first public display of the technology, and we invite attendees to meet with Dataguise and Google regarding this innovative solution."
Sign up for the free insideBIGDATA newsletter.
Originally posted here:
Dataguise Strengthens Sensitive Data Governance in the Cloud with Support for Google Cloud Storage - insideBIGDATA
CORRECTING and REPLACING ZeroStack and Nexenta Offer … – Yahoo Finance
MOUNTAIN VIEW, Calif.--(BUSINESS WIRE)--
Please replace the release dated April 11, 2017, with the following corrected version due to multiple revisions.
The corrected release reads:
ZEROSTACK AND NEXENTA OFFER CONVERGED CLOUD/STORAGE SOLUTION
ZeroStack Teams up with Nexenta to Create Integrated Private Cloud Solution that Reduces Cloud Operations & Storage Costs for Enterprise and Cloud Workloads
ZeroStack, the leader in making self-driving private cloud affordable for all companies, and Nexenta, the global leader in Open Source-driven Software-Defined Storage (OpenSDS), today announced a joint solution that integrates ZeroStack's Intelligent Cloud Platform with Nexenta's storage systems to create a pre-tested, completely automated, and fully supported converged private cloud solution. With this solution, enterprises and managed service providers can now leverage Nexenta's industry-first hardware- and protocol-agnostic Software-Defined Storage (SDS) portfolio, delivering complete freedom from storage hardware vendor lock-in, to build a highly resilient and high-performing cloud for application development, running packaged enterprise applications and hosting.
The combined ZeroStack/Nexenta solution offers these unique advantages:
"ZeroStack makes on-premises cloud simple and affordable, and this solution allows our customers to combine Nexenta solutions with the ZeroStack platform," said Tarkan Maner, Chairman & CEO at Nexenta. "Our OpenSDS solutions give customers the storage agility they need, and ZeroStack's cloud platform extends storage into the cloud for self-service use on a self-healing infrastructure."
Both Nexenta and ZeroStack will market the solution to their customers and resellers. With this combined solution, Nexenta and ZeroStack resellers can offer their customers strategic advice on cloud and storage options while retaining customers who might otherwise have no choice but to move to a public cloud provider.
"Nexenta has a unique storage solution for enterprises that want high performance and scalability," said Ajay Gulati, CEO and Co-Founder at ZeroStack. "By combining our products into a single converged solution, we give our customers the fastest, most reliable access to data in a turnkey on-premises cloud solution."
Helpful Links
ZeroStack Inc.
ZeroStack Inc. Blog
ZeroStack Inc. on Twitter
About ZeroStack
ZeroStack uses smart software and artificial intelligence to deliver a self-driving, fully integrated private cloud platform that offers the agility and simplicity of public cloud at a fraction of the cost. On premises, ZeroStack's cloud operating system converts bare-metal servers into a reliable, self-healing cloud cluster. This cluster is consumed via a self-service SaaS portal. The SaaS portal also collects telemetry data and uses artificial intelligence to create models that help customers make decisions about capacity planning, troubleshooting and optimized placement of applications. The integrated App Store enables one-click deployment of many applications that provide the platform for most modern cloud native applications. This solution is fully integrated with public clouds to offer seamless migration between clouds. Founded by senior engineers from VMware and Google, the company is funded by Formation 8 and Foundation Capital, and is based in Mountain View, California. For more information, visit http://www.zerostack.com or follow us on Twitter @ZeroStackInc.
View source version on businesswire.com: http://www.businesswire.com/news/home/20170411005270/en/
View post:
CORRECTING and REPLACING ZeroStack and Nexenta Offer ... - Yahoo Finance