Category Archives: Cloud Storage

Uncovering the Long-Term Cloud Archive Equation – TV Technology

Deciding where to keep an organization's media-centric assets for the long run (e.g., original content, EDLs, finished masters, copies, versions and releases) is reshaping how data is stored and archived. The continued popularity of cloud-based solutions for ingest and content collection, playout from the cloud, and processing in virtual environments leads many to rethink storage in the cloud.


Yet cloud concerns still leave the door open to storage alternatives, tier-based migration, automated management and more.

What are the possible alternatives: backup or archive? On-prem or in a co-lo? Private, hybrid or public cloud? Knowing the differences can change how you approach archive management, regardless of the size, location or type of your data libraries.

BACK IT UP OR ARCHIVE IT

Let's look first at the differences between backup and archive.

A backup is a duplicate copy of data used to restore lost or corrupted data in the event of unexpected damage or catastrophic loss. By definition, the original data is retained even after a backup is created; it is seldom deleted. Many back up just in case something happens to the original version; for others, it is a routine process mandated by policy, or adopted because they've previously suffered through a data disaster and pledged never to live through that again.

On a small scale, for a local workstation or laptop, good practice suggests making a nightly copy of the computer's data to another storage medium, e.g., a NAS or a portable 4-8 TB drive. Travel makes this difficult, so online alternatives prevail.

Businesses routinely back up their file servers (unstructured data) and their databases (structured data) as a precaution against short-term corruption of data on a local drive. Snapshots or images of an entire drive (OS, applications and data) are often suggested by administrators, software vendors and portable hard drive manufacturers.

Incremental backups, which capture only data that is new or has changed since the last backup, are used because of today's larger storage volumes and the time a full backup requires. A minimal sketch of the idea follows.
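To make the incremental idea concrete, here is a minimal sketch in Python, assuming a simple modification-time comparison; the source and destination paths are hypothetical, and real tools also handle deletions, renames and verification.

```python
import os
import shutil
import time

SOURCE = "/home/user/data"          # hypothetical source tree
DEST = "/mnt/backup/incremental"    # hypothetical backup target
STATE_FILE = os.path.join(DEST, ".last_backup")

def last_backup_time():
    """Timestamp of the previous run, or 0.0 if this is the first backup."""
    try:
        with open(STATE_FILE) as f:
            return float(f.read().strip())
    except (FileNotFoundError, ValueError):
        return 0.0

def incremental_backup():
    cutoff = last_backup_time()
    copied = 0
    for root, _dirs, files in os.walk(SOURCE):
        for name in files:
            src = os.path.join(root, name)
            # Copy only files created or modified since the last run.
            if os.path.getmtime(src) > cutoff:
                dst = os.path.join(DEST, os.path.relpath(src, SOURCE))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied += 1
    os.makedirs(DEST, exist_ok=True)
    with open(STATE_FILE, "w") as f:
        f.write(str(time.time()))
    print(f"Backed up {copied} changed file(s)")

if __name__ == "__main__":
    incremental_backup()
```

Run nightly, this copies only what changed since the previous night, which is exactly why incrementals finish in a fraction of the time a full backup takes.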

STORAGE AS A SERVICE

Archived data could be placed on a local NAS, transportable disk drive, an on-prem protected storage array partition, or linear data tape. Since an archive is about putting the data on a shelf (so to speak), the choices vary based on need.

Cloud archiving is about storage as a service, and it is intended for long-term retention and preservation of data assets. Archives are where data isn't easily accessed and remains undisturbed for long, uninterrupted periods.

Archiving used to mean pushing data to a digital linear tape (DLT) drive and shipping those tapes to an Iron Mountain-like storage vault. In this model, recovering any needed data was a lengthy process: retrieving the information from a vault, copying it to another tape, putting it onto a truck and returning the tape to the mothership, where it was then copied back onto local storage, indexed against a database (MAM), and made available on a NAS or SAN.

This method involves risks, including loss of or damage to tapes in transport, tapes that went bad, or data that was corrupted when written to tape in the first place. Obsolescence of either the tape media or the drives themselves meant that every few years the data tapes had to be refreshed, adding further risk of data corruption or other unknowns.

As technology moved onward, robotic tape libraries pushed the process toward creating two tape copies: one for on-prem applications and one to place safely in a vault under a mountain somewhere. While this reduced some risks (duplicate copies now sat in storage at diverse locations), it didn't eliminate the refresh cycle, and it meant additional handling (shipping vaulted tapes back for updating). Refresh always added costs: tape library management, refresh-cycle transport, tape updates including drives and physical media, plus the labor to perform those migrations and updates.

STORAGE OVERLOAD

Images keep getting larger. Formats above HD are now common and quality keeps improving, pushing native-format editing (true 4K and 5K) upward; add dozens to hundreds more releases per program, and the storage archive equation needs continual rebalancing. As files get larger, the amount of physical media needed to store a master, its protection copies and one or more archive copies of every file is causing storage overload. Decisions on what to archive are balanced against costs and the unpredictable reality that the content may never need to be accessed again.

High-capacity, on-prem storage vaults can only grow so large; a hardware refresh on thousands of hard drives every couple of years can be overwhelming from a cost and labor perspective. Object-based storage is solving some of those space and protection issues, but having all your organization's prime asset eggs in one basket is risky and not very smart business. This opens the door to cloud-based archiving.

MANY SHAPES AND SIZES

Fee-based products such as iCloud, Carbonite and Dropbox are good for some. These products, while cloud-based, have varying schemes and work well for many users and businesses. Individual iPhone users get Apple iCloud almost cost-free, but with limited storage. Other users prefer services tied to a specific drive or computer device that offer unlimited storage or file counts.

So why pick one service over another? Is one a better long-term solution than another? Do we really want an archive, or a readily accessible copy of the data in case of an HDD crash?

One common denominator of most commercial backup/archive services is that they keep your data in the cloud. Data is generally accessible from any location with an internet connection and is replicated in at least three locations. However, getting your data back (from a less costly archive) has a number of cost- and non-cost-based considerations. Recovering a few files or photos is relatively straightforward, but getting gigabytes of files back is another question. So beware of what you sign up for, and know what you're paying for and why.

Beyond these common points, differences in capability, accessibility, cost, serviceability and reliability become the key indicators that in turn drive differing uses and applications.

ACCESS FROM THE CLOUD

The cost of cloud-stored data has a direct relationship to how accessible that data is and how often it must be retrieved.

If you rarely need the data and can afford to wait a dozen hours or more for recovery, choose a deep-storage or cold-storage solution. If you simply need more physical storage and intend regular daily or weekly access, select a near-term (probably not archive) storage solution. Options include on-prem (limited storage, rapid accessibility), off-prem in a co-located (co-lo) storage environment, often referred to as a private cloud, or a public (commercial) cloud provider such as Google, Azure, AWS, IBM or others.

Cloud services continue to grow in usage and popularity, yet a degree of confusion remains about which kind of cloud service to deploy and for which kinds of assets. Many users want regular access to their archived material; for a true archive that is the wrong approach, and it can be more costly (as much as 4:1) than putting the data into deep or cold storage and keeping only frequently used data in a short-term, easily accessible environment (a temporary storage bucket). The sketch below shows how that tradeoff plays out.
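A small cost model makes the tier decision tangible. The per-GB rates below are illustrative assumptions only, not any provider's actual price list; real archive tiers also add retrieval latency and minimum-retention charges.

```python
def monthly_bill(stored_gb, storage_rate, retrieved_gb, retrieval_rate):
    """One month's charge: capacity plus retrieval/egress, in dollars."""
    return stored_gb * storage_rate + retrieved_gb * retrieval_rate

# Illustrative per-GB monthly rates -- assumptions, not quoted prices.
HOT = {"storage_rate": 0.023, "retrieval_rate": 0.00}    # easily accessible tier
COLD = {"storage_rate": 0.004, "retrieval_rate": 0.09}   # deep/cold archive tier

STORED_GB = 50 * 1024  # a 50 TB media library

for retrieved in (0, 2_000, 20_000):  # GB pulled back per month
    hot = monthly_bill(STORED_GB, retrieved_gb=retrieved, **HOT)
    cold = monthly_bill(STORED_GB, retrieved_gb=retrieved, **COLD)
    print(f"retrieve {retrieved:>6} GB/month: hot ${hot:>7,.0f}  cold ${cold:>7,.0f}")
```

At zero retrieval the archive tier costs roughly a fifth as much; pull back 20 TB a month and it becomes the more expensive choice, which is the access-pattern penalty the article warns about.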

Fig. 1: Example of sharing storage services among varying cloud providers and for multiple purposes, some on-prem, some in the cloud. Concept courtesy of Spectra Logic.

Fig. 1 shows a hybrid managed solution with local cache, multiple cloud storage providers, and local/on-prem primary archive serviced by an object-based storage bucket manager. The concept allows migration, protection, and even retention of existing storage subsystems.

CHOICES AND DECISIONS

Picking cloud service providers for your archive is no easy decision; comparing services and costs can be like selecting a gas or electricity provider. Plans change, sometimes often. Signing onto a deep archive becomes a long-term commitment, primarily because of the cost of retrieving the data, even though the initial upload costs are much lower. If your workflows demand continual data migration, don't pick a deep or cold-storage plan; look at a near-line solution instead. Be wary of long-term multi-year contracts: cloud vendors are very competitive, offering advantages that can adjust annually.

The amount of data you're going to store will continually grow. Be selective about the types of data stored and the duration you expect to keep that data, and choose wisely as to what is really needed. Your organization's policies may dictate "store everything," so know what the legal implications are, if any.

Carefully look at the total cost of ownership (TCO) of the platform, then weigh the long-term vs. short-term model appropriately.

Karl Paulsen is CTO at Diversified and a SMPTE Fellow. He is a frequent contributor to TV Technology, focusing on emerging technologies and workflows for the industry. Contact Karl at kpaulsen@diversifiedus.com.


Is Cloud Computing the Answer to Genomics Big Data… – Labiotech.eu

The success of the genomics industry has led to the generation of huge amounts of sequence data. If put to good use, this information has the potential to revolutionize medicine, but the expense of the high-powered computers needed to achieve this is making full exploitation of the data difficult. Could cloud computing be the answer?

Over the last decade, genomics has become the backbone of drug discovery. It has allowed scientists to develop more targeted therapies, boosting the chances of successful clinical trials. In 2018 alone, over 40% of FDA-approved drugs had the capacity to be personalized to patients, largely based on genomics data. With that percentage having doubled over the past four years, the trend is unlikely to slow down anytime soon.

The ever-increasing use of genomics in the realm of drug discovery and personalized treatments can be traced back to two significant developments over the past decade: plunging sequencing costs and, consequently, an explosion of data.

As sequencing technologies are constantly evolving and being optimized, the cost of sequencing a genome has plummeted. The first sequenced genome, part of the Human Genome Project, cost €2.4B and took around 13 years to complete. Fast forward to today, and you can get your genome sequenced in less than a day for under €900.

According to the Global Alliance for Genomics and Health, more than 100 million genomes will have been sequenced in a healthcare setting by 2025. Most of these genomes will be sequenced as part of large-scale genomic projects stemming from both big pharma and national population genomics initiatives. These efforts are already garnering immense quantities of data that are only likely to increase over time. With the right analysis and interpretation, this information could push precision medicine into a new golden age.

Are we ready to deal with enormous quantities of data?

Genomics is now considered a legitimate big data field: just one whole human genome sequence produces approximately 200 gigabytes of raw data. If we manage to sequence 100M genomes by 2025, we will have accumulated over 20B gigabytes of raw data. This mass of data can be partially managed with data compression technologies from companies such as Petagene, but that doesn't solve the whole problem.

What's more, sequencing is futile unless each genome is thoroughly analyzed to achieve meaningful scientific insights. Genomics data analysis normally generates an additional 100 gigabytes of data per genome for downstream analysis, and requires massive computing power supported by large computer clusters, a feat that is economically unfeasible for the majority of companies and institutions. The back-of-the-envelope calculation below shows the scale involved.
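A quick sanity check of those figures, using the article's own per-genome numbers:

```python
GENOMES = 100_000_000           # genomes projected to be sequenced by 2025
RAW_GB_PER_GENOME = 200         # raw sequence data per genome (quoted above)
ANALYSIS_GB_PER_GENOME = 100    # downstream analysis output (quoted above)

raw_gb = GENOMES * RAW_GB_PER_GENOME
analysis_gb = GENOMES * ANALYSIS_GB_PER_GENOME

def fmt(gb):
    # 1 exabyte = 1e9 GB in decimal units, for a rough sense of scale
    return f"{gb:.1e} GB (~{gb / 1e9:.0f} EB)"

print("raw:      ", fmt(raw_gb))                # 2.0e+10 GB (~20 EB)
print("analysis: ", fmt(analysis_gb))           # 1.0e+10 GB (~10 EB)
print("combined: ", fmt(raw_gb + analysis_gb))  # 3.0e+10 GB (~30 EB)
```

Twenty exabytes of raw data, plus roughly half that again in analysis output, is far beyond what most institutional clusters can hold.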

Researchers working with large genomics datasets have been searching for other solutions, because relying solely on high-performance computing (HPC) clusters for data analysis is economically out of the question for many. Large servers require exorbitant amounts of capital upfront and incur significant maintenance overheads. Not to mention, specialized high-end hardware, such as graphics processing units, requires constant upgrades to remain performant.

Furthermore, as most HPCs have different configurations, ranging from technical specs to required software, the reproducibility of genomics analyses across different infrastructures is not a trivial feat.

Cloud computing: a data solution for small companies

Cloud computing has emerged as a viable way to analyze large datasets fast without having to worry about maintaining and upgrading servers. Simply put, cloud computing is a pay-as-you-go model that lets you rent computational power and storage, and it's pervasive across many different sectors.

According to Univa, an industry leader in workload scheduling in the cloud and HPC, more than 90% of organizations requiring high-performance computing capacity have moved, or are looking into moving, to the cloud. Although this figure is not specific to companies in the life sciences, Gary Tyreman, Univa's CEO, suggests that pharmaceutical companies are ahead of the market in terms of adoption.

The cloud offers flexibility, an alluring characteristic for small life science companies that may not have the capital on-hand to commit to large upfront expenses for IT infrastructure: HPC costs can make or break any company. As a consequence, many opt to test their product in the cloud first, and if numbers look profitable, they can then invest in an in-house HPC solution.

The inherent elasticity of the cloud enables companies to scale their computational resources in relation to the amount of genomic data they need to analyze. Unlike with in-house HPC, this means there is no risk that money will be wasted on idle computational resources.

Elasticity also extends to storage: data can be downloaded directly to the cloud and removed once the analyses are finished, with many protocols and best practices in place to ensure data protection. Cloud resources are allocated in virtualized slices called instances. Each instance's hardware and software is pre-configured according to the user's demands, ensuring reproducibility. The sketch below illustrates the pay-as-you-go pattern.
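As an illustration of that elasticity, here is a sketch using boto3 (the AWS SDK for Python), assuming credentials are already configured; the AMI ID, instance type and per-worker throughput are placeholders, not recommendations.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Size the worker fleet to the batch of genomes waiting for analysis.
genomes_queued = 48
GENOMES_PER_INSTANCE = 8                            # assumed per-worker throughput
count = -(-genomes_queued // GENOMES_PER_INSTANCE)  # ceiling division -> 6 workers

resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image with the pipeline baked in
    InstanceType="c5.4xlarge",        # illustrative compute-optimized type
    MinCount=count,
    MaxCount=count,
)
instance_ids = [i["InstanceId"] for i in resp["Instances"]]

# ... run the analyses, push results to object storage ...

# Elasticity in action: release the fleet so nothing is paid for idle capacity.
ec2.terminate_instances(InstanceIds=instance_ids)
```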

Will Jones, CTO of Sano Genetics, a startup based in Cambridge, UK, offering consumer genetic tests with support for study recruitment, believes the cloud is the future of drug discovery. The company carries out large data analyses for researchers using its services in the cloud.

In a partnership between Sano Genetics and another Cambridge-based biotech, Jones's team used the cloud to complete the study at a tenth of the cost and in a fraction of the time it would have taken with alternative solutions.

Besides economic efficiency, Jones says that moving operations to the cloud has provided Sano Genetics with an additional security layer, as the leading cloud providers have developed best practices and tools to ensure data protection.

Why isn't cloud computing more mainstream in genomics?

Despite all of the positives of cloud computing, we haven't yet seen global adoption of the cloud in the genomics sector.

Medley Genomics, a US-based startup using genomics to improve diagnosis and treatment of complex heterogeneous diseases such as cancer, moved all company operations to the cloud in 2019 in a partnership with London-based Lifebit.

Having spent more than 25 years at the interface between genomics and medicine, Patrice Milos, CEO and co-founder of Medley Genomics, recognized that cloud uptake has been slow in the field of drug discovery, as the cloud has several limitations that are preventing its widespread adoption.

For starters, long-term cloud storage is more expensive than its HPC counterpart: cloud solutions charge per gigabyte per month, whereas with HPC, once you've upgraded your storage disk, you have no additional costs. The same goes for computing costs: while the cloud offers elasticity, Univa's CEO Tyreman says the computation cost of a single analysis can be five times that of an HPC solution in many scenarios. However, as cloud technologies continue to progress and competition among providers intensifies, the ongoing cloud war will likely bring prices down. A simple break-even sketch follows.
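The storage half of that comparison is easy to model. The figures below are assumptions for illustration, not quoted prices, and the model deliberately ignores the power, administration and replication overheads that burden the HPC side.

```python
def breakeven_months(disk_cost, capacity_gb, cloud_rate_per_gb_month):
    """Months until cumulative cloud storage charges match a one-time disk purchase."""
    return disk_cost / (capacity_gb * cloud_rate_per_gb_month)

# Assumed figures: a $400, 16 TB drive vs a generic standard-tier cloud rate.
months = breakeven_months(disk_cost=400, capacity_gb=16_000,
                          cloud_rate_per_gb_month=0.023)
print(f"break-even after ~{months:.1f} months")  # ~1.1 months at these rates
```

On raw per-gigabyte price the disk wins almost immediately, which is the point being made here; the cloud's case rests on everything this naive model leaves out: redundancy, maintenance, elasticity and collaboration.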

Furthermore, in the world of drug discovery, privacy and data safety are paramount. While cloud providers have developed protocols to ensure the data is safe, some risks still exist, for example, when moving the data. Therefore, large pharmaceutical companies prefer internal solutions to minimize these risks.

According to Milos, privacy remains the main obstacle for pharmaceutical companies to fully embrace the cloud, while the cost to move operations away from HPCs is no longer a barrier. While risks will always exist to a certain extent, Milos highlighted that the cloud allows seamless collaboration and reproducibility, both of which are essential for research and drug discovery.

Current players in the cloud genomics space

Cloud computing is a booming business and 86% of cloud customers rely on three main providers: AWS (Amazon), Azure (Microsoft) and Google Cloud. Although the three giants currently control the market, many other providers exist, offering more specialized commercial and academic services.

Emerging companies are now leveraging the technology offered by cloud providers to deliver bioinformatics solutions in the cloud. One example is London-based Lifebit, whose technology allows users to run any bioinformatics analysis through any cloud provider with a user-friendly interface, effectively democratizing bioinformatics for all researchers, regardless of skill set.

Federation is a concept from computing now used in the field of genomics. It allows separate computers in different networks to work together to perform secure analysis without having to expose private data to others, effectively removing any potential security issues.

"The amount of data organizations are now dealing with has become absolutely unmanageable with traditional technologies, and is too big to even think about moving," explained Maria Chatzou Dunford, Lifebit's CEO and co-founder.

"When data is moved, you increase the chances of it being intercepted by third parties, essentially putting it at significant risk. Data federation is the only way around this: unnecessary data storage and duplication costs, and painstakingly slow data transfers, become a thing of the past."

Getting ready for the genomics revolution

It's no secret that genomics is key to enabling personalized medicine and advancing drug discovery. We are now seeing a genomics revolution in which we have an unprecedented amount of data ready to be analyzed.

The challenge now is: are we ready for it? To be analyzed, big data requires massive computational power, which effectively becomes an entry barrier for most small organizations. Cloud computing provides an alternative way to scale analyses while facilitating reproducibility and collaboration.

While the cost and security limitations of cloud computing are preventing companies from fully embracing the cloud, these drawbacks are technical and are expected to be resolved within the next few years.

Many believe that the benefits of the cloud heavily outweigh its limitations. With major tech giants competing to offer the best cloud solutions, in a market expected to be valued at $340 billion by 2024, we may well see a drastic reduction in costs. While some privacy concerns may still exist, leading genomics organizations are developing new tools and technologies to protect genomic data.

Taken as a whole, it is likely that the cloud will be increasingly important in accelerating drug discovery and personalized medicine. According to Univa's Tyreman, it will take around 10-15 years to see the accelerated transition from HPC to cloud, as large organizations are often conservative in embracing novel approaches.

"Distributed big data is the number one overwhelming challenge for life sciences today, the major obstacle impeding progress for precision medicine," Chatzou Dunford concluded.

"The cloud and associated technologies are already powering intelligent data-driven insights, accelerating research, discovery and novel therapies. I have no doubt we are on the cusp of a genomics revolution."

Filippo Abbondanza is a PhD candidate in Human Genomics at the University of St Andrews in the UK. Alongside his PhD, he is interning at Lifebit and working as a marketing assistant at Global Biotech Revolution, a not-for-profit company growing the next generation of biotech leaders. When not working, he posts news on LinkedIn and Twitter.



HPC In 2020: Acquisitions And Mergers As The New Normal – The Next Platform

After a decade of vendor consolidation that saw some of the world's biggest IT firms acquire first-class HPC providers such as SGI, Cray, and Sun Microsystems, as well as smaller players like Penguin Computing, WhamCloud, Appro, and Isilon, it is natural to wonder who is next. Or maybe, more to the point, who is left?

As it turns out, there are still plenty of companies, large and small, that can fill critical holes in the product portfolios of HPC providers, or those who want to be HPC players. These niche acquisitions will be especially important to these same providers as they expand into HPC-adjacent markets such as artificial intelligence, data analytics and edge computing.

One company that can play into all of these markets is FPGA-maker Xilinx. Since Intel acquired Altera in 2015, Xilinx is the only standalone company of any size that makes reconfigurable logic devices. Given that, the natural buyer for Xilinx would be AMD, Intel's arch-nemesis. AMD, of course, already has a highly competitive lineup of CPUs and GPUs to challenge its much larger rival, and the addition of an FPGA portfolio would open a third front. It would also provide AMD entry into a whole array of new application markets where FPGAs operate: ASIC prototyping, IoT, embedded aerospace/automotive, 5G communications, AI inference, database acceleration, and computational storage, to name a few.

The only problem is that Xilinx's current market cap is around $25 billion, or about half the current market cap of AMD. And if you're wondering about AMD's piggy bank, the chipmaker has $1.2 billion cash on hand as of September 2019. That means any deal would probably take the form of a merger rather than a straight acquisition. There's nothing wrong with that, but a merger is a more complex decision and has greater ramifications for both parties. That's why the rumors of a Xilinx acquisition have tended to center on larger semiconductor manufacturers that might be looking to diversify their offerings, like Broadcom or Qualcomm. Those acquisitions wouldn't offer the HPC and AI technology synergies that AMD could provide, but they would likely be easier to execute.

Another area that continues to be ripe for acquisitions is the storage market. In HPC, Panasas and DataDirect Networks stand alone (well, stand together) as the two HPC specialists left in the market. And of those two, the more modest-sized Panasas would be easier to swallow. But most HPC OEMs, including the biggies like Hewlett Packard Enterprise, Dell Technologies, and Lenovo, already have their own HPC storage and file system offerings of one sort or another, although Lenovo is probably the most deficient in this regard. For what it's worth, though, Panasas, which has been around since 1999, has never attracted the kind of suitor willing to fold the company's rather specialized parallel file system technologies into its own product portfolio. In all honesty, we don't expect that to change.

The real storage action in the coming years in HPC, as well as in the enterprise and the cloud, is going to be in the software-defined space, where companies like WekaIO, VAST Data, Excelero, and DataCore Software have built products that can virtualize all sorts of hardware. That's because the way storage is used and deployed in the datacenter these days is being transformed by cheaper capacity (disks), cheaper IOPS (NVM-Express and other SSD devices), the availability of cloud storage, and the inverse trends of disaggregation and hyperconvergence.

As we noted last July: "While there are plenty of NAS and SAN appliances being sold into the enterprise to support legacy applications, modern storage tends to be either disaggregated (with compute and storage broken free of each other at the hardware level but glued together on the fly with software to look local) or hyperconverged (with the compute and block storage virtualized and running on the same physical server clusters, atop the same server virtualization hypervisors)."

Any of the aforementioned SDS companies, along with others, may find themselves courted by OEMs, storage-makers, and even cloud providers. DDN has been busy in that regard, having acquired software-defined storage maker Nexenta in May 2019, and we expect to see more such deals in the coming years. Besides DDN, other storage companies like NetApp should be looking hard at bringing more SDS in-house. The big cloud providers (Amazon, Microsoft, Google, and so on) will also be making some big investments in SDS technologies, even if they're not buying such companies outright.

One market that is nowhere near the consolidation stage is quantum computing. However, that doesn't mean companies won't be looking to acquire some promising startups in this area, even at this early stage. While major tech firms such as IBM, Google, Intel, Fujitsu, Microsoft, and Baidu have already invested heavily in in-house development and are busy selecting technology partners, other companies have taken a more wait-and-see approach.

In the latter category, one that particularly stands out is HPE. In this case, the company is more focused on near-term R&D, like memristors and other memory-centric technologies. While there may be some logic in letting other companies spend their money figuring out the most promising approaches to quantum computing, then swooping in to copy (or buy) whatever technology proves most viable, there is also the risk of being left behind. That's something HPE cannot afford.

That said, HPE has recently invested in IonQ, a promising quantum computing startup that has built a workable prototype using ion-trap technology. The investment was provided via Pathfinder, HPE's investment arm. In an internal blog post on the subject penned by Abhishek Shukla, managing director of global venture investments, and Ray Beausoleil, Senior Fellow of large-scale integrated photonics, the authors extol the virtues of IonQ's technical approach:

"IonQ's technology has already surpassed all other quantum computers now available, demonstrating the largest number of usable qubits in the market. Its gate fidelity, which measures the accuracy of logical operations, is greater than 98 percent for both one-qubit and two-qubit operations, meaning it can handle longer calculations than other commercial quantum computers. We believe IonQ's qubits and methodology are of such high quality, they will be able to scale to 100 qubits (and 10,000 gate operations) without needing any error correction."
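To see why gate fidelity governs how long a calculation can run, a first-order estimate treats each gate as succeeding independently, so circuit success decays as fidelity raised to the gate count. This uniform-error model is our simplification, not IonQ's or HPE's math:

```python
# Success probability of an n-gate circuit when each gate succeeds
# independently with probability f (a first-order, uniform-error estimate).
def circuit_success(f, n_gates):
    return f ** n_gates

for f in (0.98, 0.999, 0.9999):
    probs = "  ".join(f"{circuit_success(f, n):.2e}" for n in (100, 1_000, 10_000))
    print(f"per-gate fidelity {f}: {probs}")
```

Under this model, 98 percent fidelity gives a 100-gate circuit about a 13 percent chance of running error-free, while 10,000 gates would need per-gate fidelity near 99.99 percent, which is why claims about uncorrected gate depth hinge on qubit quality.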

As far as we can tell, HPE has no plans to acquire the company (and it shares investment in the startup with other companies, including Amazon, Google, and Samsung, among others). But if HPE is truly convinced IonQ is the path forward, it would make sense to pull the acquisition trigger sooner rather than later.

We have no illusions that any of this will come to pass in 2020, or ever. As logical as the deals we have suggested seem to us, the world of acquisitions and mergers is a lot more mysterious and counterintuitive than we'd like to admit (cases in point: Intel buying Whamcloud, or essentially buying Cloudera through such heavy investment). More certain is the fact that these deals will continue to reshape the HPC vendor landscape in the coming decade as companies go after new markets and consolidate their hold on old ones. If anything, the number of businesses bought and sold will increase as high-performance computing, driven by AI and analytics, extends into more application domains. Or, as the Greeks put it more succinctly, the only constant is change.


Amazon will ask a court to block Microsoft from working on a $10 billion cloud computing contract – Q13 News Seattle

Amazon will ask a federal court to temporarily block Microsoft from working on a $10 billion cloud computing contract for the military, according to a court filing Monday.

Microsoft is scheduled to begin its work on the contract on Feb. 11. But Amazon's cloud computing division will seek a preliminary injunction to prevent the issuance of substantive task orders under the contract, the filing said. Amazon's request will be submitted by Jan. 24.

The request is part of Amazon's ongoing challenge to the Trump administration over the way the contract was awarded, which it argues was influenced by President Donald Trump's dislike of Amazon CEO Jeff Bezos and the Washington Post, which Bezos owns. Amazon was widely believed to be the front-runner to win the Pentagon's business before Trump vowed to take a strong look at the deal. It lost the contract to Microsoft's Azure cloud business in October.

The contract, called Joint Enterprise Defense Infrastructure, or JEDI, involves providing cloud storage of sensitive military data and technology, such as artificial intelligence, to the Department of Defense, and could result in revenue of up to $10 billion over 10 years.

After Trump's remarks, the Defense Department launched a review of the JEDI contract. Defense Secretary Mark Esper recused himself from the review less than a month before the contract was awarded, citing his son's employment at IBM, which had also briefly been in the running for JEDI.

The Pentagon said it chose Microsoft because the company would help "improve the speed and effectiveness with which we develop and deploy modernized technical capabilities to our men and women in uniform."

Late last year, Amazon filed a suit with the US Court of Federal Claims contesting the decision.

Microsoft declined to comment Monday night.


Business Cloud Storage Market Outlook: Heading to the Clouds – The Market Journal

The latest 2020 version of the market study on the Global Business Cloud Storage Market has been released, with 122+ market data tables, pie charts, graphs and figures spread through its pages, plus easy-to-understand, in-depth analysis. It covers the Global Business Cloud Storage Market by type (less than 100GB, 100GB to 1TB, 1TB to 5TB, more than 5TB), by end-users/application (Application I, Application II, Application III), by industry size and organizations, and by region, with a forecast and outlook to 2026. At present, the market has established its presence. The research presents a complete assessment of the market and contains future trends, current growth factors, focused opinions, details, and industry-certified market data.

Get access to sample pages @ https://www.htfmarketreport.com/sample-report/2425157-global-business-cloud-storage-market

1. Who is poised to win in 2020

Looking out to 2020, it's expected to be a big year for the Global Business Cloud Storage Market in terms of growth. As more companies move some or all of their applications to the cloud, emerging players are poised to benefit. Some of the players profiled in the overall coverage were Zoolz, OpenDrive, JustCloud, MozyPro, Egnyte, CrashPlan, Dropbox, Carbonite, OpenText & Box. With the Business Cloud Storage market forecast to grow YY% in 2020 and with X-X-X-X expected to be a big beneficiary, it is better positioned than Z-Z-Z-Z for 2020.

2. A wave of New Business Segments comes crashing in

According to HTF MI, key business segments' sales will cross the $$ mark in 2020, signalling changing consumer preferences. Unlike the classified segments popular in the industry, i.e. by type (less than 100GB, 100GB to 1TB, 1TB to 5TB, more than 5TB) and by end-users/application (Application I, Application II, Application III), the latest 2020 version is further broken down and narrowed to highlight new and emerging twists in the industry.

For more detail, enquire @ https://www.htfmarketreport.com/enquiry-before-buy/2425157-global-business-cloud-storage-market

3. How are the Business Cloud Storage companies responding?

With their latest earnings releases, industry players are disclosing plans to expand their models to bring new offerings to market faster and with more precision. Market makers and end consumers are getting a glimpse of this process through new products; the study therefore gives special attention to demand-side analysis as well, to better understand consumer behaviour and changing preferences.

With large investments from the giants putting new flavour into the market, it remains to be seen how effective their new product lines will be and just how much growth they will deliver.

Be the first to tap the potential that the Global Business Cloud Storage market holds. Uncover the gaps and opportunities, and derive the most useful insights from our research publication to outpace the market. Buy this research report @ https://www.htfmarketreport.com/buy-now?format=1&report=2425157

Research objectives

- To study and analyse the Global Business Cloud Storage Market size by key regions/countries, product type and application, using history data from 2014 to 2018 and a forecast to 2026.
- To understand the structure of the Business Cloud Storage Market by identifying its various sub-segments.
- To focus on the key Global Business Cloud Storage Market players: to define, describe and analyse their value, market share, competitive landscape, SWOT analysis and development plans for the next few years.
- To analyse the Business Cloud Storage Market with respect to individual growth trends, future prospects, and their contribution to the total market.
- To share detailed information about the key factors influencing the growth of the market (growth potential, opportunities, drivers, industry-specific challenges and risks).
- To project the size of the Business Cloud Storage Market with respect to key regions, type and applications.
- To analyse competitive developments such as expansions, agreements, new product launches and acquisitions in the market.

Read the detailed index of the full research study @ https://www.htfmarketreport.com/reports/2425157-global-business-cloud-storage-market

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Western/Eastern Europe or Southeast Asia.

About Author:

HTF Market Report is a wholly owned brand of HTF Market Intelligence Consulting Private Limited. HTF Market Report, a global research and market intelligence consulting organization, is uniquely positioned not only to identify growth opportunities but also to empower and inspire you to create visionary growth strategies for the future, enabled by an extraordinary depth and breadth of thought leadership, research, tools, events and experience that assist you in making goals a reality. Our understanding of the interplay between industry convergence, mega trends, technologies and market trends provides our clients with new business models and expansion opportunities. We are focused on identifying accurate forecasts in every industry we cover, so our clients can reap the benefits of being early market entrants and can accomplish their goals and objectives.

Contact us:
Craig Francis (PR & Marketing Manager)
HTF Market Intelligence Consulting Private Limited
Unit No. 429, Parsonage Road, Edison, NJ, New Jersey, USA 08837
Phone: +1 (206) 317 1218
sales@htfmarketreport.com
Connect with us at LinkedIn | Facebook | Twitter


Jussie Smollett is probably toast now that Google is handing his data to the special prosecutor – Washington Examiner

Actor Jussie Smollett, aka "the gay Tupac," almost certainly lied about the assault he claims he experienced last year. A new order from a judge in Chicago will likely prove it once and for all.

Cook County Judge Michael Toomin, who appointed a special prosecutor to look into the case, is requiring Google to turn over a year's worth of Smollett's emails, location data, and messages. He's also ordered the same for Smollett's manager, a witness to the alleged hate crime, wherein Smollett says he was jumped early one morning by two men who confronted him with racist, anti-gay, and pro-Trump comments before beating him.

The Chicago Tribune reported last week that the court orders include that Google hand over "drafted and deleted messages; any files in their Google Drive cloud storage services; any Google Voice texts, calls and contacts; search and web browsing history."

That's a lot of data. A lot.

Chicago police last year already believed that they had nailed down a timeline proving Smollett was lying about the attack and that he had staged it himself in conjunction with two acquaintances. A grand jury agreed with the police, slapping Smollett with a dozen charges.

But if police were so confident by simply using Smollett's highly redacted phone records, surveillance footage, and the confessions of the two brothers, imagine what this new treasure trove of information will offer.

Google location data is alarmingly specific, and it's recorded minute by minute. That is, unless Smollett turned off location services on his mobile devices. And even then, there are times that Google is still monitoring and recording.

Even without that information, though, there's endless information to be learned from Google searches Smollett conducted, websites he visited, and any messages he might have sent around the time of the incident.

Smollett still says he's innocent. We'll see.

The only reason there's a special prosecutor looking into his case is that the Cook County prosecutor's office abruptly declined to pursue it, despite the overwhelming evidence that Smollett had faked a hate crime.

The special prosecutor appointment was bad news for Smollett. And this new court order is really bad news for him. Google it.


The Easiest Way to Switch from Windows 7 to Linux – Lifehacker

Welcome to the last day of Windows 7; the last day Microsoft is giving out security updates for the antiquated operating system, that is. While you have plenty of options for upgrading Windows 7, and even a hack that might be able to extend your updates for years, one of the best things you can do if you don't want to make the jump to Windows 10 is to take a 90-degree turn toward Linux.

Yes, Linux. Don't be scared. While your first thought is probably, "that's too complicated for me," hear me out. There are a number of Linux distributions that look and feel like the Windows you're already familiar with. You won't find yourself sitting in front of a command prompt, wondering what to do next, unless that's the kind of experience you want. Otherwise, Linux isn't terrifying in the slightest.

If you're sticking with Windows 7 for a specific reason (apps that only work on that version of the OS and nothing else), we even have a workaround for that, too: virtualizing Windows 7 so you can still access it in a safe, as-you-need-it fashion (assuming your system can handle it).

Stick with us, and we'll show you just how easy it is to switch to Linux, along with all the great apps that couldn't be easier to download and install in the OS. (We do love package managers.)

For the sake of keeping this article under a million words, I'm going to assume that you've already saved your critical data and everything else you need from your existing Windows 7 installation. (You should be backing up your system all the time anyway, so this shouldn't be a surprise.)

If you're nervous about switching over, you can start by creating a live CD (or live USB) of the Linux distribution we'll be using, Linux Mint. In fact, you'll have to do this anyway to install it, so you might as well get it out of the way now. By booting to a live CD when your computer starts instead of Windows 7, you'll be able to explore what it's like to use Linux Mint as if you had actually installed it on your system. Nothing you do in the OS persists between reboots (it's all temporary), but this at least gives you the ability to try out this Linux distribution and see if you like it before you fully commit.

For most people, I recommend creating a list of all the Windows apps you've installed and saving that to a cloud-storage account somewhere, along with any other critical data that fits (your documents, for example). Upload your photos to a cloud-storage service as well: either Google Photos, if that's sufficient, or an online storage service if you need to preserve your shots in their original quality. Take all the time you need to do this part, because you only get one shot at it (unless you've taken the secondary step of using a service like Backblaze to automatically save all your stuff, or have cloned your entire drive elsewhere).

Finally, write down your Windows 7 product key. If you have no idea where or what that is, use the ProduKey utility to find it.

Double-click the (blatantly obvious) Install Linux Mint icon on your desktop, which should pop up this first screen:

Continue forward. Eventually, you'll see a screen that looks something like this, which is what I like to call the "point of no return" in this process.

Sure, you could create separate partitions on your primary drive for Linux Mint and Windows 7. It's a great alternative if your system isn't strong enough to handle a virtual machine, but you'd like the ability to use Windows 7 for tasks you can't otherwise complete within Linux. I'm not going to get into it in this guide, but know that dual-booting both operating systems is an option. In fact, Linux Mint makes this easy for you, in a screen I can't show you because I installed this OS on a blank virtual machine (for convenience's sake). Yes, I'll soon be running a VM inside a VM. Insert Hans Zimmer's Inception score here.

Once you've made your choice (I stuck with the default options), you'll soon be asked to create a username and password. Standard stuff. After that, Linux Mint will begin the process of wiping your drive and installing the OS. Here's hoping you backed up your critical information from Windows 7, like I advised earlier.

When Linux Mint starts back up again, you won't need your CD or USB key anymore, and your screen should look like this:

Ah, the fresh scent of a new operating system. Tempted as you might be to start re-downloading your files onto your new OS (from the cloud, from an external storage device, or from wherever else you stored your Windows 7 data), resist the urge just yet. Pull up Driver Manager to see if there's anything new for your system that you need to update or install, then do the same with Update Manager. Restart your computer as needed.

Assuming your system can handle it, virtualizing Windows 7 on your Linux desktop is a great way to keep the OS hanging around for those moments when there simply isn't a Linux alternative for whatever it is you need to do. And to do that, we'll be using VirtualBox to run an instance of Windows 7 directly within Linux Mint. It sounds complicated, but it's not.

To get started, pull up Linux Mint's software manager. You should see a listing for VirtualBox on the front page; if not, search for it.

Installing the app is easy: just click on the button and do whatever the prompts request of you.

Launch VirtualBox, and you'll see a boring and blank interface. Fix that by clicking the New button, picking Windows 7 as your operating system, and giving your virtual machine a useful name.

You'll be asked how much memory you want to allocate to your virtual machine. Linux Mint needs about 2GB, at minimum, to run smoothly, and Windows 7 should get at least 1GB of memory if you're running a 32-bit version of the operating system, and 2GB if you're bumping up to a 64-bit version.

Next, you'll be asked to create a hard disk for Windows 7. Again, you'll need slightly more space for a 64-bit version of the OS (20GB minimum) than a 32-bit version (16GB minimum). You'll also want to think about how much other stuff (data, apps, et cetera) you'll want to put on your Windows 7 instance, and decide accordingly.

If you choose poorly, don't freak out. You can embiggen this virtual hard drive later. Life goes on.

Once you click on Create, you're halfway done. You'll want to click on your virtual machine in the listing, and then click on the Settings icon. There's plenty to play with in here, but I recommend checking out two key sections: System, which will let you assign additional processors to your virtual machine if you want to boost its speed a bit, and the ever-critical Storage.

Once you get to Storage, you're going to want to take a moment to pull up a web browser in Linux Mint and navigate over to Microsoft's site, where you'll be able to download a full disk image (.ISO) of Windows 7. Yes, you'll need your product key for this.

Once you've downloaded that .ISO to Linux Mint, head back to VirtualBox's storage settings. Click on the disc icon under Storage Devices, and then click on the similar-looking disc icon on the right side of the window. Select "Choose Virtual Optical Disk File," and go find that Windows 7 .ISO.

There's more you can play with in Settings, like sharing a folder from Linux Mint (such as Downloads) that you can then pull up in Windows 7 if you want, but you've now checked off all the basic requirements. You can fire up your virtual machine via the big green start arrow on the main screen and begin the process of installing and configuring your new Windows 7 installation. If you prefer the command line, the whole setup can also be scripted, as sketched below.
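For the terminally curious, the same VM can be created from a terminal by driving VirtualBox's VBoxManage tool, here wrapped in a small Python script. This is a sketch under assumptions: a recent VirtualBox, a 64-bit Windows 7 image, and placeholder names and paths you'd adjust to your system.

```python
import subprocess

def vbox(*args):
    """Run a VBoxManage command, raising an error if it exits non-zero."""
    subprocess.run(["VBoxManage", *args], check=True)

NAME = "win7"
ISO = "/home/user/Downloads/Win7.iso"  # placeholder path to the Microsoft .ISO

vbox("createvm", "--name", NAME, "--ostype", "Windows7_64", "--register")
vbox("modifyvm", NAME, "--memory", "2048", "--cpus", "2")         # 2GB RAM, 2 vCPUs
vbox("createhd", "--filename", f"{NAME}.vdi", "--size", "20480")  # 20GB disk
vbox("storagectl", NAME, "--name", "SATA", "--add", "sata")
vbox("storageattach", NAME, "--storagectl", "SATA", "--port", "0",
     "--device", "0", "--type", "hdd", "--medium", f"{NAME}.vdi")
vbox("storageattach", NAME, "--storagectl", "SATA", "--port", "1",
     "--device", "0", "--type", "dvddrive", "--medium", ISO)      # insert the installer
vbox("startvm", NAME)
```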

After that, it's back to the basics: Make sure you've grabbed any updates you need from Windows Update and have installed whatever apps you plan to use. My advice is to resist getting hooked back into Windows 7 once again. The OS might seem more familiar than Linux Mint, initially, but if you can accomplish the same task in Linux that you could in Windows, opt for Linux. Your performance will be better, first of all, and you'll be doing whatever it is you're doing in a more secure operating system.

If you're still a little "what do I do next?" whenever you launch Linux Mint, I understand. Let's get you up to speed with some useful apps. Were I looking to use Linux Mint like I use Windows, I'd hit up the aforementioned Software Manager and grab these apps:

Of course, there are plenty of others worth installing, too (Steam? Discord?). These are the basics, but Linux Mint's Software Manager makes it incredibly easy to find and install more.


Join us live online today: Find out how to store and manage data in the hybrid-cloud era to boost your business – The Register

Webcast: You know the story. Your users are creating data faster than ever before.

But ask yourself: is your information being stored effectively? Can your users get hold of the data they need, and use it when and where they want?

If your answer is "no," or even just "not always," today's webcast, brought to you by NetApp, is for you.

Processing terabytes upon terabytes of business data to extract valuable insights and trends can and should take place anywhere: in your data center, computer room, on your desk, on the road, and in the cloud.

Managing your data across many of these different types of systems, scattered over multiple locations and jurisdictions, can be difficult, though. To keep costs and security under control, you need to have a robust, reliable data platform to handle the ingestion and creation of information, cope with the active use of this data, and manage its long-term storage through to its end of life. And this all has to work without constant skilled supervision.

So, what can you do? Join us today at 3pm GMT to find out, as Tony Lock of Freeform Dynamics and Adrian Cooper of NetApp discuss what options are available to build a robust unified data platform that operates across hybrid-cloud environments.

If you need to improve how you manage your data, or your users could do with some help to get the most out of all the data you hold, please join this webcast, and get involved.

Click right here to sign up now.



VIEWPOINT: Refocusing as a Digital-First Publication – Georgetown University The Hoya

At our first onboarding event for The Hoya, we remember how a well-known news media professional who then consulted for The Hoya informed us that the only thing we need to know about a journalist's job is that it is always changing. This was especially evident in 2016.

The Hoya faced headwinds heading into its 98th year. When we started our roles as editor-in-chief and general manager, the newspaper was on a precipice. Print readership and thus advertising revenue had declined year after year. The continued focus on print, however, had disconnected the paper from the majority of our readers, who accessed our content online. The Hoya was also in serious need of investment, though a shrinking topline forced the organization to rely on external funding sources, making budgeting for investment virtually impossible. Against this backdrop, our leadership team engaged in a monthslong process to identify what change was necessary if the paper was going to survive the coming decades.

As we kicked off an assessment to understand The Hoya's future, it quickly became clear that we needed to become a digital-first publication. A pivot to an online daily format with a weekly print edition would allow us to leverage our deep content expertise, diverse and inclusive culture, and entrepreneurial spirit to take on a quickly evolving media landscape.

The leadership team put countless hours into a framework that would eventually form the foundation of this transition. We solicited input from experts, our membership and our readers, knowing much was at stake no less than the future of an organization that had been integral to our Georgetown University experience. Our team set ambitious goals, but they were exactly what was needed to take The Hoya into the digital age.

In the spring of 2017, having consulted with our most important organizational stakeholders and after 30 years of printing twice weekly, our leadership and the wider membership of The Hoya voted overwhelmingly in favor of transitioning to an online daily format with a weekly print edition. It was time to get to work.

Our North Star was always our readers. Through the ups and downs of the organizational overhaul, we evaluated our work by how it would improve our content and further our relationship with our readers.

Daily online publication necessitated a drastic increase in the volume of our content, and as we considered ways to increase our production capacity, we had to redefine our relationship with students, faculty and members of the community and understand where there were gaps in our coverage. We also understood that online articles have always been met with greater scrutiny simply by nature of the medium, according to a study by Kantar. For our writers and editors, this skepticism meant increased diligence to ensure the factual accuracy of every article. It also meant a renewed emphasis on community outreach in pursuit of balanced representation.

Within the first week of our term, we overhauled our entire online engagement strategy. Our readers had been finding us through Facebook, Twitter and Instagram. Correspondingly, we expanded our multimedia and social media teams and created content that would likely be consumed in a news feed on a mobile device.

To succeed as a digital-first publication, we also needed to become a 21st century newsroom. Gone were the days of exchanging article edits physically through USBs and undependable hard drives. Cutting one of our print issues allowed us to reallocate resources to building cloud storage systems, introducing online project management and communication platforms, and replacing a technology infrastructure first put together in the late 1990s.

Perhaps the most important transformation of all, however, was cultural. Becoming a daily operation required our staffers to spend more hours in the office, often late into the night. This time was not spent on coursework, part-time jobs or social lives, but on publishing stories and giving a voice to people central to the Georgetown community.

When someone is first hired by The Hoya, they are told that the quality of our product is only as strong as the quality of our staffers; our membership is our strongest asset. Therefore, it was necessary to invest heavily in our staffers and their well-being. We expanded staff professional development opportunities, formalized philanthropy efforts, built a strong network of Hoya alumni and piloted a health and wellness initiative.

Those staffers who have worked with us know our bold predictions are few and far between. Still, as we ruminate on our past experience with The Hoya, we feel confident in one thing: though The Hoya will undoubtedly continue to evolve in its next 100 years, there is no better training ground to prepare today's student journalists for tomorrow's newsroom.

Toby Hung (COL '18) is a former editor-in-chief of The Hoya. Daniel Almeida (MSB '18) is a former general manager of The Hoya.


Cloud Computing: Current Top Trends and Technologies – Datamation

Register for this live video webinar - Tuesday, January 21, 11 AM PT. Ask a top cloud expert - get your questions answered by an industry leader.

Cloud computing has grown from emerging disrupter to the very foundation of today's enterprise IT, and yet the pace of change in the cloud sector shows no signs of lagging.

Hybrid cloud has given way to multicloud -- or is that just hype? The concept of "cloud native" is now au courant, offering its own myriad challenges. Emerging technologies, from microservices to Kubernetes to edge computing, are prompting big shifts.

These many and constant new developments beg the question: what do I need to know to truly be current with cloud in 2020?

To provide insight, I'll speak with a leading cloud expert, Bernard Golden. Golden has held a number of top tech positions; most recently he was Vice President, Cloud Strategy, at Capital One. Wired magazine dubbed him "one of the ten most influential people in Cloud Computing." He's the author of Amazon Web Services for Dummies, a bestselling cloud computing book.


In this webinar you will learn:

Bernard Golden, top cloud computing expert

James Maguire, Managing Editor, Datamation (moderator)

Register for this live video webinar - Tuesday, January 21, 11 AM PT

Get your cloud questions answered by a leading expert.
