Category Archives: Cloud Storage
Cloud bursting: What it is and what it's good for – ComputerWeekly.com
One of the key benefits of the cloud is its flexible, or elastic, nature.
Organisations can increase compute resources and storage capacity when they need it with little more than a web browser and a credit card. And, if needs change, they can reduce capacity and cost almost as easily.
This, however, does require applications and workflows to operate natively on public cloud infrastructure. And some organisations are either not ready to move all their systems to the cloud or, for regulatory, security or operational reasons, are unable to.
But hybrid architectures provide a way to harness the flexibility of the cloud and tap into its ability to scale. Firms might prefer or need to keep a base level of IT capacity on-premise, or even keep most of their workloads in-house.
But they still want an affordable and flexible way to deal with peak demand. This is where cloud bursting comes in.
Cloud bursting allows firms to take advantage of the cloud's almost limitless scale and capacity on a temporary basis, but without the need to move workloads permanently to the public cloud.
Instead, systems are designed to shift to cloud resources as needed, and to switch back to on-premise IT as soon as the peak is over. This could be for weeks, days or even just a few minutes.
This avoids bottlenecks and a poor user experience, and maximises the utilisation of on-premise infrastructure without needing to build in capacity on-premise for predicted peaks. It minimises cloud usage fees, too, because customers only pay for on-demand cloud capacity during the peak, and avoid the energy and other costs associated with underused on-premise hardware.
Often, companies use cloud bursting to cope with peaks such as end-of-year financial analysis or seasonal variations in usage. According to Tony Lock, analyst at Freeform Dynamics, retailers are among the keenest users of cloud bursting as it allows them to manage periods of high demand.
And at a micro-level, cloud bursting can even be used to provide extra capacity for systems such as virtual desktops when more staff are in the office. This could happen during a shift change or temporary busy periods.
Cloud bursting does, however, require an IT architecture that supports it, although this is becoming easier with technologies such as containers.
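As a rough illustration of the burst-and-release loop described above, the following Python sketch polls a hypothetical on-premise utilisation metric and starts or terminates short-lived cloud instances via boto3. The AMI, instance type, thresholds and the monitoring hook are placeholder assumptions, not any vendor's implementation.

```python
import time
import boto3

CPU_BURST_THRESHOLD = 85    # burst when on-prem CPU stays above this (%)
CPU_RELEASE_THRESHOLD = 50  # release cloud capacity once load drops below this (%)

ec2 = boto3.client("ec2", region_name="eu-west-1")
burst_instance_ids = []


def get_onprem_cpu_utilization():
    """Hypothetical hook into on-prem monitoring (Prometheus, vCenter, etc.)."""
    raise NotImplementedError


def scale_out(count=2):
    """Start short-lived cloud instances to absorb the peak."""
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder image baked with the app
        InstanceType="c5.xlarge",
        MinCount=count,
        MaxCount=count,
    )
    burst_instance_ids.extend(i["InstanceId"] for i in resp["Instances"])


def scale_in():
    """Tear the burst capacity down as soon as the peak is over."""
    if burst_instance_ids:
        ec2.terminate_instances(InstanceIds=burst_instance_ids)
        burst_instance_ids.clear()


while True:
    cpu = get_onprem_cpu_utilization()
    if cpu > CPU_BURST_THRESHOLD and not burst_instance_ids:
        scale_out()
    elif cpu < CPU_RELEASE_THRESHOLD and burst_instance_ids:
        scale_in()
    time.sleep(60)  # re-evaluate once a minute
```

Container platforms make the same pattern easier because the burst capacity can join an existing cluster rather than being provisioned as standalone machines.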
"This benefits businesses in several ways as it can allow short-term requirements to be fulfilled for relatively low cost, with businesses only consuming resources when they need them, saving any unnecessary capital expenditure," says Neil Clark, cloud services director at IT consultancy QuoStar.
Cloud bursting can also offer firms a way to tailor additional capacity to specific workloads, further saving money. Although virtualisation has helped IT departments consolidate servers and storage, it can still be necessary to build additional peak capacity for different applications.
An artificial intelligence training application is likely to have different demands to an enterprise resource planning system, for example. So, chief information officers can use the cloud's flexibility to pick the right compute and storage resources to support each application.
On paper, almost any application that faces capacity constraints will benefit from cloud bursting. In practice, those that rely on large volumes of data or data that is tightly controlled for security, privacy or regulatory reasons are harder to burst. Either moving the data takes too long, or it is not permitted.
Bursting also works best with fairly short duration peaks in workload. Although there is no simple rule for this, if you burst for too long, on-demand cloud pricing starts to look expensive compared with fixed-term agreements.
"Cloud bursting is best suited to compute-intensive and non-critical workloads that fluctuate in their capacity requirements, such as batch jobs," says Anay Nawathe, a principal consultant at ISG. He adds that workloads running on the edge are also good candidates for bursting.
Workloads that work less well are those with close ties between the application and storage, and those that demand high performance and low latency.
A further challenge with cloud bursting is to ensure consistent quality of service, especially for web applications or public-facing services such as e-commerce.
If users notice a significant degradation of performance during bursting prompted by a spike in demand, it might disrupt their interaction and prompt them to switch to a competitor. Thorough planning and testing are needed to ensure bursting works and performs as hoped.
As a result, cloud bursting is best suited to workloads with regular, short but fairly predictable peaks that are not too demanding in compute, input/output or latency. It is less suited to high-performance or performance-critical applications.
The technology behind cloud bursting is well established. But although containers, public cloud at core and edge and private cloud technologies make it easier, IT departments still need to plan and test to make sure bursting works.
Also, it is easier to burst a single application than a workflow that depends on several suppliers' technologies and a mix of compute and storage. "Bursting only compute to the cloud is significantly less complicated than bursting compute and data to the cloud," says ISG's Anay Nawathe.
There are commercial decisions, too. There is, according to Adrian Bradley, head of cloud transformation at KPMG, a technical overhead to cloud bursting.
Also, bursting plans that were based on low spot prices for cloud capacity a few years ago might no longer provide the best value as cloud providers try to move customers to regular commitments. That points towards moving whole workloads to the cloud rather than bursting.
"Public cloud providers incentivise you to put the whole workload on there permanently, rather than having your predictable workload on premise and then bursting occasionally into the cloud," said Bradley.
This highlights another financial consideration: who sanctions the additional costs to burst?
"You first have to plan it, make sure it works, test it, and then have a process that sets out who actually says we should burst, because there's going to be extra charges involved," says Freeform Dynamics' Lock. "Someone needs the authority to say okay, we need to go into a cloud burst scenario and pay the extra."
LucidLink, AJA, and Telestream Simplify Workflows for Media & Entertainment Companies to Work from Anywhere, in Tandem – PR Newswire
With Direct Access to Data, Media Professionals Can Work on High-Resolution and High-Bit-Rate Media Files Over the Internet
SAN FRANCISCO, Sept. 1, 2022 /PRNewswire/ -- LucidLink, a global leader in remote collaboration software for creative teams, announced with AJA Video Systems and Telestream the fastest way to access, organize and view data in one easy workflow. LucidLink Filespaces, an award-winning SaaS-based solution, provides rapid access to data through AJA Diskover Media Edition in conjunction with Telestream GLIM and the Vantage platform. LucidLink serves as a central hub uniting what are typically two disparate systems, creating seamless workflows never before feasible for media professionals.
"We are excited to be showing this powerful solution for media professionals at IBC 2022," said Rupert Watson, LucidLink's Director of Alliances and Channels, EMEA. "Pulling together the innovative technologies from AJA, Telestream, and LucidLink into one workflow brings new capabilities for creative teams. Now teams can work in tandem on the same project, giving them the power to truly collaborate and create in real-time, no matter where they are based."
One workflow from three powerful solutions
The companies have come together to build a powerful solution for media professionals: LucidLink's centralized data repository provides immediate data access combined with AJA Diskover Media Edition, which helps catalog and find the right file while customers use Telestream GLIM to review the data and Vantage for transcoding. The combination offers great speed; LucidLink allows remote workers all over the globe to share and access data with speeds that are normally only reserved for high-speed on-prem storage. AJA Diskover Media Edition on top of LucidLink allows organizations to very rapidly index extremely large file systems stored via LucidLink and gives users visibility to the data no matter where they are located, but without the security risks associated with full, potentially destructive access to the data.
Simply put, these combined technologies allow media professionals to access, organize and view data in one easy workflow.
"The media and entertainment industry is bottlenecked with unwieldy data management across production and post, while remote workflows present additional challenges for managing file locations and metadata on the cloud. AJA Diskover Media Edition removes the hurdles associated with data management, helping industry professionals work more efficiently and make more informed data decisions. Our partnership with LucidLink further simplifies navigating cloud storage, enabling remote teams to collaborate seamlessly via a streamlined workflow akin to using on-premises storage. By pairing Diskover Media Edition with LucidLink, users can move immediately from viewing media and metadata to further production of assets with tools like Telestream's Vantage package," said Nick Rashby, President, AJA Video Systems.
LucidLink Filespaces: The future of cloud storage streaming for teams
LucidLink Filespaces provides an infinitely scalable, centralized repository of data in the cloud that can be immediately accessed from anywhere. LucidLink works on any major operating system. The client can be installed on any server, laptop, or workstation and mounted as a local volume. As a mount point on a server, LucidLink provides a standard directory structure and instant access to data for all these various services that have never been able to integrate seamlessly. LucidLink enables AJA's Diskover and Telestream's GLIM and Vantage servers, located in different parts of the world, to easily access data and work in concert with each other.
Telestream GLIM helps creative professionals quickly preview high-quality media files with color accuracy via any web browser without generating a proxy file. With direct access and the ability to launch the player from AJA Diskover Media Edition, professionals can easily view and validate files located on-premises or in cloud storage, as indexed by Diskover Media Edition with associated metadata. All files are shown with SCTE-35 markers, waveform view, and audio metering to ensure compliance standards on the fly, for any user, in any location.
Access to Telestream's powerful Vantage platform via AJA Diskover Media Edition v2.0 gives remote workers access to centralized tools for generating proxy files or transcoding media assets for delivery. Users can now select files and send them to an on-premises or cloud-based Telestream Vantage system to create proxies or transcode the assets, regardless of user, file, or Vantage system location. Coupled with AJA Diskover Media Edition's global index, the plug-in accelerates workflows by eliminating roundtripping between applications and streamlining file handling.
LucidLink will be at IBC 2022 (Hall 7, Booth B06). For more information or to schedule a briefing with LucidLink at IBC 2022, please contact Clare Plaisted at [emailprotected].
About LucidLink
LucidLink offers an innovative cloud-native file service designed specifically for extensive data access over distance. LucidLink Filespaces provides best-in-class security and high-performance scalability to run file-based workloads on object storage for maximum efficiency and productivity. The service is compatible with Microsoft Azure Blob and any Amazon S3-compatible object storage provider, whether cloud, on-prem, or hybrid. It supports all major operating systems, including Linux, Windows, and macOS. Investors include Baseline Ventures, Headline, Adobe, Bright Cap Ventures, Bain Capital Ventures, S28 Capital, and Fathom Capital. LucidLink is privately held and headquartered in San Francisco, California, with offices in Bulgaria, Europe, and Australia.
About AJA Video Systems, Inc.
Since 1993, AJA Video has been a leading manufacturer of video interface technologies, converters, digital video recording solutions, and professional cameras, bringing high-quality, cost-effective products to the professional, broadcast, and post-production markets. AJA products are designed and manufactured at our facilities in Grass Valley, California, and sold through an extensive sales channel of resellers and systems integrators around the world. For further information, please see our website at http://www.aja.com.
About AJA Diskover Media Edition
AJA Diskover Media Edition is a powerful and essential tool for M&E organizations in the remote work era and for navigating the industry-wide explosion of data creation. Creative facilities are dealing with a mix of on-prem, cloud, and long-term archival storage, with everyone working in different locations around the globe, yet the need for rapid access to data and media assets remains. AJA Diskover Media Edition provides organizations with these abilities and much more. For more details, visit: http://www.aja.com/products/aja-diskover-media-edition.
For more information about LucidLink, please contact [emailprotected]. Follow us on Twitter and LinkedIn and visit us at http://www.lucidlink.com.
Contact: Julie O'Grady, LucidLink, [emailprotected], +1 (650) 269-9989
SOURCE LucidLink
Teradata takes on cloud-native rivals with data lakes, MLOps – The Register
Teradata has launched analytics and data lake platforms as it strives to steal a march on so-called cloud-native enterprise data warehouse companies.
With ClearScape Analytics, the data warehousing stalwart has launched 50 new in-database time series and ML functions designed to support end-to-end machine learning pipelines.
The company has also embraced cloud-based data lakes with a new product, VantageCloud Lake. Its cloud data warehouse platform, Teradata Vantage, has also been renamed VantageCloud Enterprise.
The last few years have seen a confluence of companies around the lakehouse concept. Despite the dubious moniker, it represents a trend towards bringing together data warehouse workloads (repeated analytics on structured data) with data lakes, the semi-structured data repositories used for more exploratory analyses.
From the data lake side, Databricks has announced Databricks SQL Serverless, designed to improve query performance and concurrency of BI and analytics workloads on its data lake. Cloudera similarly promises analytics and data exploration in a single platform.
On the data warehouse side, Snowflake has promoted its usefulness as a data lake with support for Python and unstructured data.
Teradata did make an earlier approach to supporting data lakes with the ability to run Hadoop in its on-prem analytics platform, Aster.
With VantageCloud Lake, Teradata promises centralized object storage (AWS S3 initially) offering open data formats, structured and unstructured data and flexible schema.
Teradata has supported S3 and other cloud storage options since the launch of its cloud-based Vantage platform, but Hillary Ashton, chief product officer at Teradata, told The Register its analytics and data management were now more optimized for S3.
"We support read and write and Enterprise Edition with object store, but it was really optimized for EBS block storage for low latency workloads. With [the new data lake] we have optimized for object storage. That seems subtle, but it's actually a very significant difference. That object store is the primary location for data in the lake edition and it's an auxiliary location in enterprise (data warehouse).
"To say it's optimized for object storage now means that we brought the intelligence of our indexing, and our workload management and brought it down into object storage, which differentiates us from just a typical read, write and to move into object store, and really allows us to bring the IP that we've developed over the years in terms of massive parallel processing and improvements in access time into object store," Ashton said.
In its analytics environment, Teradata has introduced more support for management of the machine learning pipeline. Called Model Ops, the system automates the process of picking the most effective champion and challenger model on a given data set.
"Model Ops allows you to manage that process in an automated fashion so that you can constantly be running champion challenger modeling at scale, which means that you're going to get to better analytic outcomes faster," Ashton said.
While this replicates some of the functionality of H2O.ai, Teradata also partners with the ML specialist. "If you've chosen H2O.ai, you can build your models there and then you can import them directly into Teradata Vantage," she said.
Analyst Tony Baer, principal at dbInsight, noted Teradata had developed its own technology instead of adopting open source table formats such as Iceberg (used by Cloudera) or Delta, used by Databricks.
"Given Teradata's longtime positioning for extreme, complex analytics, going to the data lake is a natural move. They are still doing so on their own terms as their data lake table format is not using Iceberg or Delta open source. But never say never there," he said.
Teradata's cloud strategy is an effort to grab some market and customer attention from so-called cloud-native data warehouse systems such as Snowflake, AWS Redshift, Google's BigQuery and Microsoft Azure's Synapse. But it might struggle to convince younger developers to use it given its long history of on-prem systems, Baer said.
"The cloud gives Teradata a chance to expand their footprints with existing customers to take on more discretionary workloads, but they still have their work cut out for them embracing developers, most of whom probably don't know Teradata or view it as 'their father's platform'," he said.
Do this if you don’t want to lose the files stored in the cloud – Gearrice
We can store all kinds of documents and files in the cloud. Many of them may be important, so losing them could be a serious problem. But ultimately we depend on the service we are using; it is not like having a physical hard drive of our own, where we can control what happens.
The first reason you could lose your files is that the platform stops working. This does not usually happen overnight, but you might be using a minor service with few users that shuts down. Perhaps it gives notice some time in advance, but you do not realise because you rarely use it, and when you try to access it you find it no longer exists.
If you use stable platforms, such as Google Drive, OneDrive or Dropbox, this will not happen. Not in the short term at least. But it can happen if you use other less popular and more unstable services.
Another thing to keep in mind is that you can lose your cloud files if they expire. Many services let you store content, but only for a certain time. Once that time has passed, the files are automatically deleted by the platform and can no longer be accessed.
Is this common? Well, it depends on the service. Once again, it is important to use secure cloud storage platforms to avoid these problems. Always check if the files are going to expire after a while or if they are going to remain there until you decide to delete them manually.
A cyber attack cannot be ruled out either. A hacker could exploit a vulnerability in the storage platform you are using and delete all user files. That would obviously cause big losses, but it is a possibility that is always there.
For this reason, it is advisable to make backup copies, at least of the most important, irreplaceable files. Only then will you be protected, and run no risk, if a cloud service you use falls victim to a cyber attack.
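Such a backup can be scripted. A minimal sketch, assuming an S3-compatible service with boto3 credentials already configured (the bucket and folder names are placeholders), copies every object to a local folder:

```python
import os
import boto3

# Placeholder names; any S3-compatible service with configured credentials works.
BUCKET = "my-important-documents"
LOCAL_DIR = "cloud-backup"

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):          # skip folder placeholder objects
            continue
        target = os.path.join(LOCAL_DIR, key)
        os.makedirs(os.path.dirname(target) or LOCAL_DIR, exist_ok=True)
        s3.download_file(BUCKET, key, target)   # keep a local copy of each file
        print("backed up", key)
```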
The attack may not be directed against the platform itself, but against your account. An attacker could steal your password and access everything you have stored. That inevitably puts all the content at risk, and they could delete it without you being able to do anything to stop it.
To avoid this, it is essential to protect the account. Always use strong, complex passwords, keep devices up to date and protected with a good antivirus, and enable two-factor authentication whenever possible to add an extra layer of protection against account theft.
Human error can also cause files to be deleted in the cloud. Perhaps you have a folder with important documents and hit delete by mistake. The files would then no longer be available on that platform, just as if you had erased a hard drive.
Some platforms do have a recycle bin, similar to the one in Windows, where you can restore deleted files. However, it is not always available, so be careful not to delete a file by mistake.
In short, these are the ways you could lose files stored in the cloud. It is important to take precautions so that your documents stay safe and their loss never becomes a major problem.
Wasabi Technologies Adds to Leadership Team in Japan and Australia to Support the Demand for Hot Cloud Storage Across Asia-Pacific – Sports Video…
Wasabi Technologies, the hot cloud storage company, has expanded its leadership bench in the Asia Pacific region with the additions of Aki Wakimoto and Andrew Sandes as Country Managers for Japan and Australia respectively. Wakimoto and Sandes will drive customer and partner growth as Wasabi continues its full-scale APAC expansion to meet the demand for high-performance, affordable cloud storage in this digital-first region.
Wasabi has undertaken an expansive rollout to support the region with cloud storage that is 1/5th the cost of hyperscalers, with no fees for egress or API requests and no vendor lock-in. Businesses are able to securely and affordably store all of their data and access it the moment they need it without complex pricing tiers. The executive hires of Wakimoto and Sandes follow the opening of four storage regions (Tokyo, Osaka, Sydney, and most recently Singapore) and the appointment of long-time APAC industry veteran Michael King as Vice President and General Manager of APAC in May 2022.
"APAC is quickly becoming one of Wasabi's most important markets, and we are strategically building our operations to support the incredible opportunities in both Japan and Australia," said King. "I have witnessed firsthand how successful Aki and Andrew have been in their markets, building optimized channels of distribution and driving customer success. They will take us to the next level."
As Country Manager for Japan, Wakimoto will spearhead Wasabi's growth strategy in the country, working closely with King to build Wasabi's go-to-market team, drive value for channel partners and customers, and help deliver storage solutions that maximize the ROI of data across organizations. Wakimoto brings over 20 years of experience in the Japanese IT industry to her new role at Wasabi. Previously she served as President and Representative Director of SolarWinds Japan, where she grew the company's customer and channel business and implemented a comprehensive IT Operations Management (ITOM) product portfolio to support the digital transformation of Japanese enterprises. She has also held a number of impactful sales and operations roles with CA Technologies, Citrix Systems, and Adobe Systems, and was President of Pulse Secure Japan.
Sandes holds over 20 years of experience across Oceania, Japan and the wider Asia Pacific region, helping customers solve their biggest problems using leading technologies like cloud storage. Prior to Wasabi, Sandes served as Country Manager for Australia and New Zealand at Emplifi and was previously GM Asia Pacific at Lithium Technologies (now Khoros), working with large enterprises across the region to provide solutions to high-scale online and social customer service challenges. Prior to Lithium, Sandes spent five years working in corporate advisory, executing cross-border mergers and acquisitions and capital-raising transactions in the technology, media, and telecommunications sectors. He has also held several sales and business development roles across the technology sector in APAC. With Wasabi, Sandes will focus on go-to-market strategies with Australian partners, build a team to support Wasabi's channel, and evangelize the value proposition of cloud storage to customers in the country who are dealing with concerns like ransomware, data sovereignty, and the unpredictable costs of hyperscale storage providers.
What Is Cloud-Based Healthcare? – HealthLeaders Media
Providers needed a better way to integrate their clinical and non-clinical information to understand their patients' needs and preferences. Without it, clinicians risk losing patients whose expectations around access and communication have risen dramatically. According to NRC Health, 80% of patients would switch providers for convenience factors alone.
What is cloud-based healthcare?
A cloud solution for healthcare solves many common challenges and helps providers focus on what matters most: the patient.
The cloud acts as a place where you can securely store your data and access it from anywhere. Non-cloud storage solutions keep information on a physical hard drive or internal server. The cloud, however, allows data to live on a global network ofsecure data centers.
Healthcare organizations use the cloud for connecting, storing, and maintaining traditional personal health information such as blood test results, as well as other consumer data like contact preferences. All information is accessed securely from a single console across the organization. EHRs aren't replaced; they become an integral part of the cloud system.
The cloud can easily connect to the EHR or any other system of record through an API (application programming interface), allowing organizations to centralize data and take advantage of app libraries like AppExchange. App libraries provide tailored solutions to help solve different healthcare business needs, like verifying patient medical insurance. An API can take data from a contact center, marketing database, or any other system and bring all the information together in one place.
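As an illustrative sketch of what such API-based centralisation can look like, assuming a FHIR-style REST endpoint (the base URL, patient ID and the contact-preferences lookup are hypothetical placeholders, not any specific vendor's interface):

```python
import requests

FHIR_BASE = "https://ehr.example.org/fhir"   # placeholder EHR endpoint


def get_patient_record(patient_id: str) -> dict:
    """Pull the clinical record from the EHR over its API."""
    resp = requests.get(f"{FHIR_BASE}/Patient/{patient_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()


def get_contact_preferences(patient_id: str) -> dict:
    """Hypothetical lookup of non-clinical consumer data (e.g. a CRM or contact centre)."""
    return {"preferred_channel": "email", "language": "en"}


def unified_view(patient_id: str) -> dict:
    """Combine clinical and non-clinical data into one consolidated record."""
    record = get_patient_record(patient_id)
    record["contact_preferences"] = get_contact_preferences(patient_id)
    return record


print(unified_view("12345"))
```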
On the back end, the cloud provider maintains the software and develops enhancements, so healthcare IT professionals don't have to worry about facilitating updates. This frees them to customize the platform to meet the unique needs of their business.
How can a cloud solution improve the work of healthcare organizations?
Providers have many sources of information that create data silos. The time wasted searching multiple systems and the lost opportunity to draw insights from data frustrate both employees and patients. Here's how cloud-based healthcare alleviates the burden for everyone:
Streamline operations
Teams can access all the data they need from a single program or app, instead of switching between systems, and they can see everything at an aggregate level. This makes it easier to quickly answer patient questions, manage preferences, and turn insights into actions.
Organizations can also use automation to handle time-consuming tasks. Consider a patient who needs to schedule a knee replacement. Instead of the appointment desk scanning their calendar manually for an available slot, the cloud can automatically suggest possible times via the channel of their choice. The system can also deliver a series of emails with important information from pre-visit instruction to post-visit health tips.
Personalize patient care
With a single source of truth for data, care coordinators can access a complete view of a patient's health. The coordinator can easily view the treatment history, social determinants of health, recent tests and procedures, and the extended care team. They can also access questions the patient may have submitted through a portal.
At the same time, intelligence can analyze data to help providers stratify risk, ensuring those most in need are not overlooked. This frees clinicians to focus on interacting with the patient instead of devoting time to record-keeping in the EHR.
Engage anywhere
The cloud enables clinicians and other care providers to access patient information even outside the office.
For example, a physician who is at home, at another facility, or away at a conference can still access patient information from a single dashboard. Or, when patients need to see a provider but time constraints, weather conditions, or lack of transportation make it challenging, they can opt for a virtual appointment instead. Ultimately, patients have more options to engage with their providers, and providers have more access to their patients. This connectivity helps reduce time to care and improve patient outcomes.
Dutch government finally allowed to use public cloud – ComputerWeekly.com
The public cloud market has seen huge developments over the past decade, with the Covid pandemic being an important accelerator. Cloud services have become more reliable, and are currently used by large numbers of citizens and businesses. The security of public cloud services has increased, and the large-scale deployment of updates and patches makes it far easier than before to fix errors in software. For these reasons, it was due time to revise the National Cloud Policy 2011.
The new policy allows Dutch government departments to use public cloud services. "Public cloud services offer an appealing perspective for the development of a more innovative, transparent, flexible and efficient digital government," State Secretary for Digitisation Alexandra van Huffelen wrote to the Dutch Lower House. Low initial costs and pay-per-use make the public cloud a transparent solution.
Moreover, the risks are now more manageable than before, due to large investments by public cloud providers in securing their services. This is much more than the government is willing or able to invest in information security itself.
Hence, the road to public clouds is finally open for Dutch public services, albeit under strict conditions.
Conditions for use cover, for example, the processing of personal data. Public clouds will not be allowed for basic registries, nor for the storage and processing of special categories of personal data. All storage and processing of personal data has to comply with the General Data Protection Regulation.
Furthermore, civil servants are not allowed to store state secrets in any public cloud. Neither are they supposed to use cloud services from countries with an active cyber programme that is aimed against Dutch interests. Every Dutch government department itself is responsible for assessing and monitoring any relevant risk of using a public cloud service. The Dutch Ministry of Defence still remains excluded from the new policy, and will not be allowed to use public cloud services.
Even though Van Huffelen is rather positive regarding the new national cloud policy, she is aware that risks remain, even indirect ones. For example, should a US cloud services provider be acquired by a Chinese state-owned enterprise, the use of that particular public cloud by Dutch public services would no longer be permitted.
All departments formulate their own cloud policy and strategy, based on the new National Cloud Policy. Those bodies of the government that do not form part of the civil service are requested to follow this advice. In addition, departments are required to incorporate an exit strategy in their contracts with public cloud providers to make sure that, in case of the acquisition example above, they are assured of immediate cancellation of the service. This exit strategy should also indicate how the data will be returned and destroyed on the side of the provider.
"The digital world is not without risks," said Van Huffelen. "Not even if we had a fully self-managed cloud."
According to the Dutch Central Bureau of Statistics, in 2020, 53% of Dutch businesses used the cloud, of which 39% used a public cloud.
Van Huffelen wrote in her letter that those businesses are also demanding a high degree of security and privacy. The Dutch central bank (DNB) has pioneered in managing cloud risks in the financial sector.
Today, 49% of Dutch banks use the cloud, of which 38% use a public cloud. Almost 60% of healthcare organisations in the Netherlands use a private cloud, and 43% of them also use a public cloud.
Moreover, research shows that over 50% of government organisations worldwide use office applications from the cloud. These numbers were an important accelerator to revise the Dutch National Cloud Policy of 2011.
Van Huffelen plans to start evaluating the new Cloud Policy from 2023.
XenData kit takes tape copies of cloud archives – Blocks and Files
Archiving system supplier XenData has launched an appliance which makes a local tape copy of a public cloud archive to save on geo-replication and egress fees.
XenData's CX-10 Plus is a networked rackmount box containing an SSD system drive and a 14TB disk drive cache. Its system software sends incoming data out to the public cloud and retains a local synchronized copy of the cloud archive files on LTO tape cartridges.
CEO Phil Storey stated: "The CX-10 Plus has two key benefits. Creating a local synchronized copy of every file written to the cloud gives peace of mind. And the appliance easily pays for itself by minimizing cloud storage fees."
That's because having the local tape copy of the archive means customers don't have to pay public cloud geo-replication fees and, with restores coming from local tape, they don't have to pay public cloud egress fees either. CX-10 Plus pricing starts from $11,995, which doesn't cover the necessary tape hardware: two managed LTO drives or an LTO autoloader.
The system's data flow starts with one or more users sending it files for archiving. They land on a 14TB disk drive. A multi-threaded archive process then writes the data to the public cloud: AWS S3 (Glacier Flexible Retrieval, Glacier Deep Archive), Azure Blob (Hot, Cool and Archive tiers), Wasabi S3, and Seagate Lyve.
All archive files uploaded to the cloud are retained on the disk cache for a defined retention period, typically a day; the disk is not that big. Every few hours the files are synchronized to LTO, creating a mirror copy of the file-folder structure that has been archived to the cloud.
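A generic sketch of this kind of archive flow (not XenData's actual software) might look like the following, assuming boto3 and placeholder bucket and path names: a file lands in a local cache, is written to a cold cloud storage class, and is queued for later mirroring to tape.

```python
import json
from pathlib import Path

import boto3

s3 = boto3.client("s3")
BUCKET = "media-archive"                     # placeholder bucket name
CACHE_DIR = Path("archive-cache")            # local disk cache, retained ~1 day
TAPE_QUEUE = Path("pending-tape-sync.json")  # files waiting to be mirrored to LTO


def archive_file(path: Path) -> None:
    key = path.relative_to(CACHE_DIR).as_posix()
    # Write the object straight to a cold storage class to keep costs down.
    s3.upload_file(str(path), BUCKET, key,
                   ExtraArgs={"StorageClass": "DEEP_ARCHIVE"})
    # Record the file so a periodic job can mirror it to the local tape copy.
    queue = json.loads(TAPE_QUEUE.read_text()) if TAPE_QUEUE.exists() else []
    queue.append(key)
    TAPE_QUEUE.write_text(json.dumps(queue))


for f in CACHE_DIR.rglob("*"):
    if f.is_file():
        archive_file(f)
```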
XenData says the CX-10 Plus is optimized for media archives enabling users to play video files directly from the cloud. It integrates with many media applications including Media Asset Management systems.
The CX-10 Plus fits into XenData's wider appliance range.
An S3 object storage interface may be added to any XenData LTO Appliance via a software upgrade. This creates a private cloud that competes with public cloud storage services such as AWS Glacier and the Archive Tier of Azure object storage.
XenData was started in 2001 and has a headquarters office in Walnut Creek, California, and an outlying office in Cambridge, UK. The CX-10 Plus will be available in September.
AWS Preps ‘Bastion’ Cloud Service for Advertisers – The Information
Amazon Web Services is preparing to unveil a cloud service to help companies improve the way they target ads to potential customers without violating data privacy laws, according to three people with knowledge of the product. The move comes as advertisers try to recover from, and get ahead of, Apple's and Google's restrictions on their ability to track consumers online.
The new AWS service, Bastion, is known in tech industry parlance as a data clean room. It lets multiple companies pool data they have on existing or potential customers without any of them being able to view the entire pool. The idea is to protect the identity of customers, both for privacy reasons and out of competitive concerns. In these metaphorical rooms, companies pooling data (for instance, a retailer like Target and a streaming service that sells ads, such as HBO Max or Hulu) could see how much overlap there is between their respective customers and use that to determine whether Target should target new or repeat customers through ads on the streaming services.
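As a simplified stand-in for the overlap measurement a clean room performs (not AWS's actual service), each party could contribute only salted hashes of customer identifiers, with just an aggregate count leaving the "room"; real clean rooms add much stricter controls such as query restrictions and noise.

```python
import hashlib

SHARED_SALT = b"agreed-out-of-band"   # placeholder salt agreed between the parties


def hash_ids(emails):
    """Normalise and salt-hash identifiers so raw emails never leave each party."""
    return {hashlib.sha256(SHARED_SALT + e.strip().lower().encode()).hexdigest()
            for e in emails}


retailer_customers = hash_ids(["ann@example.com", "bob@example.com", "cat@example.com"])
streamer_customers = hash_ids(["bob@example.com", "cat@example.com", "dee@example.com"])

# Neither side sees the other's list; only the size of the intersection is reported.
overlap = len(retailer_customers & streamer_customers)
print("overlapping customers:", overlap)
```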
Should I buy Siacoin (SC) at the current price? – Invezz
Siacoin (SC) has weakened from $0.0052 to $0.0038 since August 11, 2022, and the current price stands at $0.0039.
The cryptocurrency market tumbled this Friday after Federal Reserve Chair Jerome Powell's hawkish speech at the Jackson Hole conference in Wyoming, and for now, everything indicates that we could see new lows.
Siacoin (SC) is a decentralized platform for cloud storage that allows any computer running it to rent out unused hard drive space to users looking to store files.
Because of this, Siacoin is branded as an "Airbnb for hard drives", and it is important to say that Siacoin ensures all data is protected against censorship and theft, and that users are never denied access to their data by miners, hackers, government bodies, or developers.
The Siacoin network is secured by blockchain technology, and files stored through the Sia network are encrypted and sliced into tiny pieces.
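As a conceptual sketch of "encrypt, then slice into pieces" (an illustration of the idea only; Sia's real protocol uses its own encryption and erasure coding, and this assumes the Python cryptography package and a placeholder file name):

```python
from cryptography.fernet import Fernet

CHUNK_SIZE = 4 * 1024 * 1024   # 4 MiB pieces

key = Fernet.generate_key()    # kept by the uploader, never by the storage hosts
with open("video.mp4", "rb") as f:            # placeholder file name
    ciphertext = Fernet(key).encrypt(f.read())

# Split the encrypted blob into fixed-size pieces that could be spread across hosts.
pieces = [ciphertext[i:i + CHUNK_SIZE]
          for i in range(0, len(ciphertext), CHUNK_SIZE)]
print(len(pieces), "encrypted pieces ready for distribution")
```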
Siacoin is not the only cryptocurrency project seeking to disrupt the storage market, and it faces competition from other decentralized cloud storage systems like Filecoin, Storj, and MaidSafe.
Despite this, Siacoin is still among the most popular service networks of its kind, and it resolves some of the most common problems in the sector of cloud storage, including hacking risks, high costs for renting storage, data control, and data mismanagement.
SC is the native cryptocurrency of the Siacoin network; those looking to store files must buy and spend SC tokens to do so. SC is the only currency that can be spent within the network, and investors should consider that SC doesn't have a finite supply, which means new SC can be mined indefinitely.
From a technical standpoint, Siacoin (SC) remains under pressure, and if you decide to buy this cryptocurrency in September 2022, you should consider that the price could weaken even more.
The cryptocurrency market tumbled this Friday after Federal Reserve Chair Jerome Powell said at the Jackson Hole conference in Wyoming that the US central bank would not pause its campaign to bring price growth down.
The rising risks of the recession and the uncertainty because of the Russian-Ukrainian war remain in the focus of investors, and we will probably see new lows for the cryptocurrency market in the days ahead.
Siacoin (SC) currently trades around the $0.0039 level, but it would be a strong sell signal if the price falls below $0.0030 support.
The next price target could be around $0.0025 or even below; still, if the price jumps above $0.0060, the path opens towards $0.0070.
Siacoin (SC) is a decentralized platform for cloud storage that allows anyone to rent out spare hard drive space or utilize other people's spare hard drive space to store files. The cryptocurrency market tumbled this Friday after Federal Reserve Chair Jerome Powell's hawkish speech at the Jackson Hole conference in Wyoming, and for now, everything indicates that we could see new lows for Siacoin (SC).