Category Archives: Cloud Servers
The Mystery Behind the Aarogya Setu App – TheLeaflet – The Leaflet
The creation of an exclusive, State-driven, contact tracing enhancer for smartphone users in India during Covid times has multiple faults. The primary one is that there is considerable mystery about where the data is being stored and to what extent it is protected. Some code has been released relating to the working of the app, but it has not brought any clarity as it falls short of both completeness and openness, reveals VICKRAM CRISHNA.
On May 1, 2020, the government notified the release of an app called Aarogya Setu (health bridge) in response to Covid-19. It was touted as an intelligent solution, but it is anything but that, as per reviews. The technical premise that smartphones are capable of authoritatively detecting risky physical proximity to potentially infected persons is specious.
Apart from the flaky claims about the technical capabilities of the phones themselves, there are troubling lacunae concerning opacity around the creation of the software code. It is the code that determines the routing of extensive data collection from users of the tool. As a result, there is considerable mystery about where the data is actually being stored and to what extent it is protected from any use outside of this particular health crisis.
The Supreme Court judgment affirming the right to privacy, on August 24, 2017, clearly lays down the means by which the fundamental contract between citizen and State is to be handled, and this applies to data collection also.
The lack of clarity about who wrote the app, with credit claimed and denied with equal enthusiasm by the alleged team and by the organs of the State allegedly charged with responsibility for it, further confounds the situation.
Apparently, a software businessman put together a team, drawing contributors from companies with which he is associated or which he controls, calling them volunteers, although they apparently continued to draw salaries while working on the code.
Neither the team nor any individuals who are claiming credit for the work are actually contracted with the government for this effort, or, at least, no such contract has yet come to light.
If they are indeed volunteers, it may seem laudable at a glance. Except that, sans a contractual relationship, there is complete and total deniability of accountability for breaches of the Constitution implicit in the collection and use of data by the software packages.
Unfortunately, there is no way, short of full investigative access to both the code and to transactions between the State and private players, to determine whether such breaches have taken place or will do so in future.
The State itself has put out equivocal statements about the possibility that some data might be held for future use.
Moreover, the entire exercise has been conducted in flagrant breach of long-standing, publicly declared government policy on the use of Free Software for public service projects.
To further muddy the waters, the minutes of crucial meetings of an Empowered Group to handle the data, among other responsibilities, have been obtained with considerable difficulty through the Right to Information Act and published on December 3, 2020. They reveal that it has all along been the intention to widely disseminate tracking data to various State players. As the authors of the report stress, such an Empowered Group itself does not have any legal status under the Disaster Management Act, 2005.
Free Software means that the code is open to inspection by anybody (and is, therefore, sometimes called Free and Open Source Software), not that it necessarily costs nothing. Software (computer or digital) is one of the most vibrant and accessible forms of technology of the modern world, except when it is developed opaquely.
In this case, there are two broad kinds of software tools involved in making this app work. One is a piece of code that runs on the phone (smartphones are wireless telephone instruments that work on inbuilt computer chips), and the second is a code that runs on a server connected to the internet. Smartphones are usually also enabled to connect to the internet by telecom service providers in India.
Some code has been released, over time, relating to the working of the app. Very ominously, the release has not brought clarity, for the simple reason that the code is not clear.
Neither the code originally released for the app nor for the server was published in a professional manner, as revealed by competent technical analyses.
Code releases for the smartphone apps began six months ago in May and are being tracked and published regularly as a public service. The backend server code was only released on November 20, and, as with the preliminary releases of app code, falls short of both completeness and openness. There is no explanation for the lengthy delay in releasing what turns out to be some incomplete and obfuscating sections of server code.
It is, therefore, still not possible to state firmly that the data collected is being deleted from both the phone (where data is said to be removed after 30 days) and from the server (claimed to be hosted on Amazon cloud servers located in Mumbai). Somewhat grimy thumbprints on the code fragments now in the public domain demonstrate that the coding teams continue to use privately owned servers to route data, not exactly a shining mark of good faith.
Rather, the Indian public is expected to vest confidence in the intentions and abilities of the coders and the State itself regarding tracking data collected from users and whether it will be handled in accordance with the law. These users include those who have been compelled and coerced to install the app by both private and public entities (public coercion was reluctantly withdrawn after extended protests).
(Vickram Crishna is a trained engineer and manager. The author's case against the Union of India and Others, opposing the operation of the state-operated technology-based national identification scheme, also resulted in a definitive judgment affirming the fundamental right to personal privacy. The views are personal.)
People Can't Vacuum Or Use Their Doorbell Because Amazon's Cloud Servers Are Down – Gizmodo Australia
Amazon Web Services (AWS) is having problems. And unfortunately for anyone who owns a vacuum or doorbell that relies on AWS, a lot of these so-called smart objects are no longer functional.
AWS is the world's biggest cloud hosting service. In just over a decade, Amazon's product has essentially become the backbone of the internet.
And early on Thursday morning Australia time, one of its major server centres began to have an issue, and not for the first time either.
Soon after people noticed, Amazon acknowledged the problem on its status dashboard for users.
"We continue to work towards recovery of the issue affecting the Kinesis Data Streams API in the US-EAST-1 Region. We also continue to see an improvement in error rates for Kinesis and several affected services, but expect full recovery to still take up to a few hours," it read.
Many of the services that you know and love, such as Adobe cloud software, 1Password and Flickr, have all been having issues because of the outage.
But there were some unexpected issues as well: namely, a lot of people began to notice that things they didn't know needed cloud computing were actually very reliant on it.
First, people's beloved robot vacuums, Roombas, stopped working.
One user tweeted about Amazon in frustration: "Some part of AWS is down and apparently it's screwing up the Roomba."
The company that produces Roombas, iRobot, confirmed they (along with their robot mops) were no longer working.
And next came the doorbells:
"My fucking doorbell doesn't work because AWS us-east-1 is having issues," another user tweeted.
Some people even lost control of their Christmas lights!
"Anyone else unable to turn on their Christmas lights because of the AWS outage?" one Twitter user asked.
I guess what we can take from this is: it's probably not a good idea to have half of the infrastructure of the internet depend on the same company, hey?
Inside Innovation: ‘In the cloud we trust’ doesn’t hack it for ransomware protection – Daily Commercial News
Software programs and platforms secured in the cloud can give companies a false sense of security against the growing number of ransomware attacks. After all, "the cloud" is just another way of saying "someone else's computer server."
The risk for a company obligated to protect vital data is the temptation to reduce or eliminate internal firewall protections and hand over security entirely to their cloud-based application suppliers.
"As companies make use of the public cloud, they need to evolve their cybersecurity practices dramatically in order to consume public-cloud services in a way that enables them both to protect critical data and to fully exploit the speed and agility that these services provide," says consultancy firm McKinsey Global.
Cyber attacks cost companies, government, health care and education entities billions of dollars each year, as much as $7.5 billion in 2019 in the U.S. alone. In Canada, steel manufacturer Stelco suffered losses from temporary production shutdowns. Other losses can take the form of recovery costs and legal implications, in addition to outright ransom demands, costs that can exceed any insurance the victim company may have in place.
Today's attackers are patient, staying active and undetected for long stretches of time: an average dwell time of 56 days, according to recent research from U.S. cyber security firm Mandiant. Dwell time is defined as the length of time cyber-attackers have free rein in networks until they are eradicated.
The U.K.'s National Cyber Security Centre (NCSC), an independent authority on online security, recently issued updated warnings to help companies recover and reduce the costs of cyber and ransomware attacks.
Most companies, of course, rely on some form of key information backup. However, the NCSC pointed out that backup data isn't much good if it's also infected with ransomware, and thus encrypted and unusable, because it was still connected to the network when the attack took place.
"We've seen a number of ransomware incidents lately where the victims had backed up their essential data (which is great), but all the backups were online at the time of the incident (not so great)," the agency continued. "It meant the backups were also encrypted and ransomed together with the rest of the victim's data."
Since ransomware can dwell in networks for long periods before detection, infected files may be recycled into backups before the malware is discovered.
The NCSC recommends that organizations keep their backups offline and separate from their networks. Cloud-based security applications offered by services such as Dropbox, OneDrive, SharePoint and Google Drive should not be sole methods of backup. In addition, the NCSC suggests no physical backup drives or USB sticks be left permanently installed in computers.
Geoff Bourgeois, CEO of Canadian data storage firm HubStor, agrees.
"Cloud storage is not inherently immune to ransomware."
He cites the vulnerabilities resulting from syncing the cloud with local data storage methods.
"When ransomware strikes, it is going to rip through your files locally and encrypt them, and the file sharing engine is going to sync this change to the cloud storage copy as well. The same concept is true in enterprise scenarios with cloud storage gateways or other storage tiering solutions. The local copy is likely to become encrypted by the ransomware and sync up to the cloud."
One answer is cloud storage offering versioning.
"With versioning, the idea is that existing versions of your data are immutable," says Bourgeois. "Since they cannot change, any modification is going to result in a new version. Versioning is, therefore, an advantage against ransomware because the encryption attack is effectively going to result in a new version of your infected files."
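Bourgeois's point can be illustrated with a small sketch. The class below is a hypothetical, in-memory stand-in for a versioned object store (it is not HubStor's or any real provider's API): every write appends an immutable snapshot, so even after ransomware overwrites a file and the change syncs up, the earlier version remains recoverable.

```python
# A minimal sketch of immutable versioning as a ransomware defence.
# Hypothetical API for illustration only.

class VersionedStore:
    def __init__(self):
        self._versions = {}  # filename -> list of immutable byte snapshots

    def put(self, name, data):
        # Existing versions are never modified; every write appends a new one.
        self._versions.setdefault(name, []).append(bytes(data))

    def latest(self, name):
        return self._versions[name][-1]

    def restore(self, name, version=0):
        # Roll back to an earlier, known-good version.
        return self._versions[name][version]


store = VersionedStore()
store.put("report.docx", b"quarterly figures")

# Ransomware "encrypts" the local file; the sync engine pushes the change up.
store.put("report.docx", b"\x8f\x1c...ciphertext...")

assert store.latest("report.docx") != b"quarterly figures"   # live copy lost
assert store.restore("report.docx", 0) == b"quarterly figures"  # version survives
```

The key design point is that the attack's overwrite becomes just another version rather than destroying the only copy, which is exactly why a sync engine alone (which mirrors the encrypted file) is not enough.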
McKinsey suggests companies develop a multi-point strategy to form a cloud-centric cyber security model aligned to their risk tolerance. This would include determining how much security is handed to cloud-based suppliers versus maintaining internal control. Only a rethink of data protection and recovery can successfully combine the move to cloud-based applications and storage with resistance to the persistent waves of ransomware attacks.
John Bleasby is a Coldwater, Ont.-based freelance writer. Send comments and Inside Innovation column ideas to editor@dailycommercialnews.com.
Will edge computing become the new cloud in 2021? – TechRepublic
Industry experts expect increased investments in edge capacity to reduce latency and support personalized content delivery and custom security policies.
IDC predicts that the edge computing market worldwide will grow to $250.6 billion by 2024. Dave McCarthy, the firm's research director of edge strategies, thinks edge products and services will power the next wave of digital transformation.
Companies will need to think about how to build out edge capacity with new infrastructure and services. Industry experts predict that companies will start to incorporate edge computing capabilities into the software deployment pipeline and use this infrastructure to support personalized content and streaming services.
Ali Fenn, president of data center consulting firm ITRenew, said that edge computing is fundamentally about how to do compute close to data sources and data users.
"2021 will see major enterprises and tech companies driving towards homogenous, cost-effective infrastructure across these tiers, from public cloud, to private in colos, to modular commercial, and to consumer-proximal," he said. "Winners will look to advanced and modern paradigms for IT and networking, and untether from conventional stacks, racks, and vendors, to deliver plug-and-play, operationally efficient IT, at the lowest possible cost."
SEE: Top cloud providers in 2020: AWS, Microsoft Azure, and Google Cloud, hybrid, SaaS players (TechRepublic)
Keith Higgins, vice president of digital transformation for Rockwell Automation, describes the edge as the new cloud. He predicts that real-time availability of mission-critical workloads will be vital for companies scaling smart factory initiatives in 2021.
"Edge computing will complement existing cloud infrastructure by enabling real-time data processing where the work takes place: motors, pumps, generators, or other sensors," he said.
The industry will continue to move toward more decentralized compute environments, and the edge will add significant value to digital transformation initiatives.
"By integrating edge functionalities with existing cloud infrastructure, organizations will worry less about logistical IT considerations and, instead, focus on rethinking what's possible in a smart machine," he said.
Here's a preview of what 2021 trends in edge computing will look like.
Aruba launched its Edge Services Platform (ESP) in June 2020 as an AI-powered, cloud-native platform to automate, unify, and protect the edge.
Aruba CTO Partha Narasimhan said the company defines "the edge" as where the users and the actions are, which could be offices, sports stadiums, or homes. He said that edge computing provides the infrastructure to understand how people use physical spaces as well as to deliver experiences.
"If you create a connectivity layer that is always on, seamless, and secure, that drives participation," he said.
One trend in edge computing that he sees is a push to bring services closer to the connectivity layer to reduce latency.
"Authentication could be centralized but fulfillment will be local," he said. "Companies will have to rethink how to extend zero trust to the edge."
Narasimhan said that Aruba has built its services around a zero-trust model since 2002.
"We don't trust our own access points, they have to authenticate," he said. "Built-in authentication determines what policy gets applied to each device."
Automating some of this process is the only way edge computing will work well, Narasimhan added.
"Policies should be used on demand when and where you need them," he said. "That means not just automation for provisioning from the cloud, but provisioning locally where security rights and policies have to be plumbed in where the user connects."
SEE: 5 Internet of Things (IoT) innovations (free PDF) (TechRepublic)
Narasimhan said that companies should use two types of tactics to accomplish this: Automation that relies on scripting and automation that learns and changes over time. Developers should monitor how services are working, measure user satisfaction, and monitor new devices and use this information to tweak automated processes as needed.
"This learning type of automation is even more important at the edge," he said.
Narasimhan also sees potential in low-power sensors that run on ambient light or ambient radio-frequency energy to expand compute power and data collection at the edge.
"If you can run these lick and stick sensors without batteries, you'll deploy more of them and you're not just installing sensors but also building analysis power," he said.
Companies should think beyond traditional workloads and network design to leverage all forms of available infrastructure to expand edge compute options, Narasimhan said.
Aruba predicts that a range of business outcomes will be powered by edge computing.
Limelight Networks is expanding its expertise with content delivery networks to the world of edge computing. Steve Miller-Jones, vice president of edge strategy and solution architecture at Limelight, said that the company is helping industrial, retail, business, and telco customers distribute apps, content, and services in disparate locations served by Limelight's private network.
"Edge functions allow you to customize the experience and give you instant scale into more locations than most clients can easily manage," he said.
SEE: Future of 5G: Projections, rollouts, use cases, and more (free PDF) (TechRepublic)
Miller-Jones said he has seen a big shift over the last year in Limelight customers using the edge as code.
"Customers are integrating edge capability into their deployment pipeline to deploy their own processes and workflows," he said. "They don't have to do any distribution, the code just runs where their requests show up."
Customers are using edge capabilities to personalize content, aggregate data, and direct traffic.
"Companies are taking those decisions and abstracting them into code," he said.
Using a serverless approach is key to reducing latency and getting the ROI out of edge solutions, Miller-Jones said.
"If you can use a serverless environment and have processes run on demand, that delivers the value of edge sensors and you get more agility and operational ability," he said.
This is particularly important for the gaming industry, which needs to offer customers faster access to servers and more intelligent routing without having to send everything to the cloud.
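The serverless, on-demand pattern Miller-Jones describes can be sketched in miniature. Everything below is illustrative, not any real edge platform's API: a handler is registered for an event type and runs only when a request shows up, making a personalization decision locally instead of round-tripping to a central cloud region.

```python
# A toy sketch of the "edge as code" / serverless pattern: register a handler,
# run it on demand where the request arrives. Names are hypothetical.

import time

HANDLERS = {}

def edge_function(event_type):
    """Register a handler to run on demand at the edge."""
    def register(fn):
        HANDLERS[event_type] = fn
        return fn
    return register

@edge_function("viewer-request")
def personalize(request):
    # Personalization decided locally, without a trip to a central region.
    region = request.get("region", "default")
    return {"body": f"content variant for {region}", "cache": region != "default"}

def dispatch(event_type, request):
    started = time.perf_counter()
    response = HANDLERS[event_type](request)
    response["latency_ms"] = (time.perf_counter() - started) * 1000
    return response

print(dispatch("viewer-request", {"region": "ap-southeast-2"}))
```

The design choice this illustrates is the one in the quote above: the customer ships only the handler through their deployment pipeline, and the platform handles distribution, so "the code just runs where their requests show up."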
Miller-Jones said that he expects to see an expansion of capacity to support streaming services as well as content personalization.
As more workloads move to the cloud, chief information security officers are looking to take a cloud-first approach to security, Gartner Senior Analyst Nat Smith said, and that's where secure access service edge (SASE, pronounced sassy) services come in.
"SASE is largely about connecting people or devices to service, which can be private or public," he said. "The big advantage of this approach is that IT teams don't have to set up and maintain access."
This approach reduces operational mistakes which in turn improves security, Smith said.
Gartner predicts that by 2024 at least 40% of enterprises will have plans to adopt SASE, up from less than 1% at the end of 2018.
In a research paper about SASE, "The Future of Network Security Is in the Cloud," Gartner analysts determined that "digital business and edge computing have inverted access requirements with more users, devices, applications, services and data located outside of an enterprise than inside."
Also, complexity, latency, and the need to decrypt and inspect encrypted traffic one time will increase demand for consolidation of networking and security capabilities into a SASE platform delivered via the cloud. Companies also need a "worldwide fabric of points of presence" to ensure low-latency to users and devices.
To accomplish this, Gartner recommends IT security leaders take these steps:
Companies should look for security vendors with significant experience setting up other customers in the cloud and protecting those environments.
"Moving things to the cloud is a fairly important trend from a security perspective because in most cases we have just opened up a lot more of our real estate and we're not protecting it as well as we would like to," he said.
Cloud Server Market – What Factors Will Drive The Market In Upcoming Years And How It Is Going To Impact On Global Industry | (2020-2026) – The…
DataIntelo offers a detailed report on the Global Cloud Server Market. The report is a comprehensive research study that covers the Cloud Server market size, industry growth opportunities and challenges, current market trends, potential players, and the expected performance of the market in regions for the forecast period from 2020 to 2027. It highlights key insights on the market, focusing on the possible requirements of clients and assisting them to make the right decisions about their business investment plans and strategies.
The Cloud Server market report also covers an overview of the segments and sub-segments, including product types, applications, companies and regions. The report further includes the impact of COVID-19 on the market and explains the dynamics of the market, future business impact, the competitive landscape of the companies, and the flow of global supply and consumption. It provides an in-depth analysis of the overall market structure of Cloud Server and assesses the possible changes in the current as well as future competitive scenarios of the Cloud Server market.
Request A Free Sample Report @ https://dataintelo.com/request-sample/?reportId=86589
The published report relies on a robust research methodology, drawing on primary sources including interviews with company executives & representatives, and on official documents, websites, and press releases of the companies. DataIntelo is known for its data accuracy and granular market reports.
The report is prepared with a group of graphical representations, tables, and figures which display a clear picture of the development of the products and their market performance over the last few years. With this precise report, readers can easily understand the growth potential, revenue growth, product range, and pricing factors related to the Cloud Server market. The report also covers recent agreements, including mergers & acquisitions, partnerships or joint ventures, and the latest developments of the manufacturers to sustain in the global competition of the Cloud Server market.
Key companies that are covered in this report:
IBM, HP, Dell, Oracle, Lenovo, Sugon, Inspur, CISCO, NTT, Softlayer, Rackspace, Microsoft, Huawei
*Note: Additional companies can be included on request
The report covers a detailed performance of some of the key players and analysis of major players in the industry, segments, applications, and regions. Moreover, the report also considers government policies in different regions, which illustrates the key opportunities as well as challenges of the market in each region.
By Application:
Education, Financial, Business, Entertainment, Others
By Type:
Logical Type, Physical Type
As per the report, the Cloud Server market is projected to reach a value of USD XX by the end of 2027 and grow at a CAGR of XX% through the forecast period (2020-2027). The report describes the current market trends of the Cloud Server in regions covering North America, Latin America, Europe, Asia Pacific, and the Middle East & Africa, focusing on market performance of the key countries in the respective regions. According to the needs of clients, this report can be customized and made available as a separate report for a specific region.
You can also go for a yearly subscription of all the updates on Cloud Server market.
You can buy the complete report @ https://dataintelo.com/checkout/?reportId=86589
The following is the TOC of the report:
Executive Summary
Assumptions and Acronyms Used
Research Methodology
Cloud Server Market Overview
Cloud Server Supply Chain Analysis
Cloud Server Pricing Analysis
Global Cloud Server Market Analysis and Forecast by Type
Global Cloud Server Market Analysis and Forecast by Application
Global Cloud Server Market Analysis and Forecast by Sales Channel
Global Cloud Server Market Analysis and Forecast by Region
North America Cloud Server Market Analysis and Forecast
Latin America Cloud Server Market Analysis and Forecast
Europe Cloud Server Market Analysis and Forecast
Asia Pacific Cloud Server Market Analysis and Forecast
Middle East & Africa Cloud Server Market Analysis and Forecast
Competition Landscape
Why you should buy this report?
This report offers a concise analysis of the Cloud Server market for the last 5 years, with historical data & a more accurate prediction for the upcoming 6 years on the basis of statistical information.
This report helps you to understand the market components by offering a cohesive framework of the key players and their competition dynamics as well as strategies.
The report is a complete guideline for clients to arrive at an informed business decision, since it consists of detailed information for a better understanding of the current & future market situation.
The report also answers some of the key questions given below:
Which end-user is likely to play a crucial role in the development of the Cloud Server market?
Which regional market is expected to dominate the Cloud Server market in 2020-2027?
How is consumer behavior impacting the business operations of market players in the current scenario of the Cloud Server market?
If you have any questions on this report, please reach out to us @ https://dataintelo.com/enquiry-before-buying/?reportId=86589
About DataIntelo:
We possess expertise in a variety of business intelligence domains. Our key analysis segments, though not restricted to the same, include market entry strategies, market size estimations, market trend analysis, market opportunity analysis, market threat analysis, market growth/fall forecasting, primary interviews, secondary research & consumer surveys.
We invest in our analysts to ensure that we have a full roster of experience and expertise in any field we cover. Our team members are selected for stellar academic records, specializations in technical fields, and exceptional analytical and communication skills. We also provide ongoing training and knowledge sharing to keep our analysts tapped into industry best practices.
Contact Info:
Name: Alex Mathews
Address: 500 East E Street, Ontario, CA 91764, United States.
Phone No: USA: +1 909 545 6473
Email: [emailprotected]
Website: https://dataintelo
IP surveillance: The storage it needs, on-premise and in the cloud – ComputerWeekly.com
Surveillance systems demand huge amounts of storage media. With most surveillance based around image capture, that will always likely be the case.
That also means that input/output (I/O) will generally be sequential: think of one movie frame after another being written to or read from a drive. But that changes somewhat with the use of artificial intelligence (AI) to add intelligence to surveillance.
But what are the storage needs of surveillance systems, what type of storage media can be used for surveillance, and what kinds of storage products are aimed at surveillance?
Monitoring and retention of video camera footage still forms a central core to the world of surveillance, and so a key requirement of storage in surveillance is the ability to handle sequential I/O well.
Meanwhile, surveillance is increasingly being supplemented with AI or so-called vision intelligence to bring actionable insight to camera-captured imagery.
So, where once security camera footage would simply have been monitored by humans and retained as a record, movement of potential interest can now be identified and alerts raised, for example. That's of obvious utility to physical security, where systems can be primed to recognise an intruder by size, shape, movement, and so on.
But AI has also been implemented in a wide range of other surveillance applications such as phasing traffic lights to improve vehicle flow, monitoring retail centre footfall to adjust product placement, and being able to recognise concerns over wellbeing such as a fallen patient in healthcare settings.
But no matter what the application and level of AI applied to it, sequential I/O will likely be the core storage performance characteristic required for surveillance systems.
However, not all apparently sequential I/O is as smooth as you might think when it comes to video. Some video codecs, for example, comprise different types of frames: full images and differential frames that contain only changed portions. A hardly changing image would see little in the way of change until a person walked across the camera's view, which means the I/O flow would be quite wave-like.
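A toy model makes this wave-like pattern concrete. The numbers below are assumptions for illustration, not figures from any real codec: periodic keyframes carry a full image, differential frames carry only the changed portion, and bytes written per frame spike when motion enters the scene.

```python
# Toy model of per-frame write sizes for a surveillance stream.
# All byte counts are illustrative assumptions, not real codec figures.

FULL_FRAME = 200_000    # keyframe: a complete image
STATIC_DELTA = 2_000    # tiny diff while the scene barely changes
MOTION_DELTA = 60_000   # larger diff while someone walks through view

def frame_sizes(n_frames, keyframe_interval=30, motion=range(40, 55)):
    sizes = []
    for i in range(n_frames):
        if i % keyframe_interval == 0:
            sizes.append(FULL_FRAME)    # periodic full image
        elif i in motion:
            sizes.append(MOTION_DELTA)  # burst during movement
        else:
            sizes.append(STATIC_DELTA)  # near-idle scene
    return sizes

sizes = frame_sizes(90)
print(f"min {min(sizes)} bytes, max {max(sizes)} bytes, total {sum(sizes)} bytes")
```

Plotting such a sequence would show long flat stretches punctuated by spikes at every keyframe and during the motion window, which is the "wave-like" flow the text describes and why even nominally sequential surveillance writes are uneven in size.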
While the requirements of image capture are largely sequential, the application of analytical and machine learning functionality is likely to involve a whole load more randomness.
That's because while ingest of an image is pretty sequential, reading and comparing patterns within images will make reference to potentially large amounts of existing patterns already saved. The AI side of things is potentially quite random in its I/O needs.
Spinning disk HDD
With sequential I/O forming the vast bulk of surveillance traffic, spinning disk HDDs are predominant in the field. Spinning disk is well suited to sequential I/O, as the mechanical nature of the drives means time savings are made when read heads don't need to cycle in and out of different locations on disk platters.
Drives can be SATA, or the more performant and more expensive serial-attached SCSI (SAS), in terms of back-end connectivity. Both will work well with the main job of video ingest.
Solid state and NVMe
However, for the kind of heavy lifting that analytics requires, solid-state storage, in which storage cells are accessed electronically rather than mechanically, doesn't incur those time penalties.
In practice, that means flash, or NVMe flash, possibly supplemented by the processing power of a GPU card.
That was the case with Hong Kong-based cloud service provider Vivavo, which leveraged NVMe-based storage for facial recognition on street cameras.
Solid state is a lot more pricey than spinning disk, so it will often make sense to hold the data being processed on a tier of flash, while HDD is fine for bulk storage of video.
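The economics of that tiering decision are easy to rough out. The per-gigabyte prices and the "hot" fraction below are assumptions for illustration only, not current market rates:

```python
# Sketch: cost of a small flash "hot" tier plus HDD bulk tier,
# versus an all-flash build. All $/GB figures are assumed.

HDD_PER_GB = 0.03        # assumed bulk HDD price
FLASH_PER_GB = 0.15      # assumed flash price

total_tb = 100           # total footage retained
hot_fraction = 0.05      # share actively processed by analytics (assumed)

hot_gb = total_tb * 1000 * hot_fraction
cold_gb = total_tb * 1000 - hot_gb

tiered = hot_gb * FLASH_PER_GB + cold_gb * HDD_PER_GB
all_flash = total_tb * 1000 * FLASH_PER_GB
print(f"tiered (flash + HDD): ${tiered:,.0f}")
print(f"all-flash:            ${all_flash:,.0f}")
```

Under these assumptions the tiered design costs roughly a quarter of the all-flash one, which is the usual argument for keeping flash to the analytics working set.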
Tape
Tape is also a potentially usable bulk storage medium, and when tapes are not in use they consume no power, unlike HDDs. But access to data held on tape is a lot slower than on other media. Automating access via LTFS, a way of layering a NAS-like file system on top of tape, would be one option. Using tape as another tier for bulk storage is another way to incorporate it.
Cloud storage
Finally, there is cloud storage, which in some senses is an ideal medium for video surveillance footage and analytics, both of which can be consumed as-a-service in the cloud.
But there are drawbacks too, such as the bandwidth required to upload video that could multiply as the number of cameras increases. There are also charges for the retention of data in the cloud to be considered. Compliance may also be an issue if images are stored outside particular jurisdictions.
The easiest way to get IP surveillance is by buying a network video recorder (NVR) that comes with everything you need: cameras, servers and connecting hardware. These may be well suited to many smaller businesses, but may also be limited in the expansion, customisation and integration possible.
When it comes to HDDs for surveillance storage, any enterprise drive will work. But with retention of large amounts of imagery for long periods with potentially small amounts of access, higher capacities with less rigorous performance requirements fit the bill. That most probably means 7,200rpm SATA most of the time.
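Sizing that bulk HDD capacity comes down to cameras, per-camera bitrate and retention period. The camera count, bitrate and retention figures below are assumptions for illustration:

```python
# Sketch: sizing bulk HDD capacity for a camera estate.
# All input figures are assumed for illustration.

CAMERAS = 50
MBPS_PER_CAMERA = 4        # assumed 1080p stream bitrate, megabits/s
RETENTION_DAYS = 30        # assumed retention policy

seconds = RETENTION_DAYS * 24 * 3600
mb_per_second = CAMERAS * MBPS_PER_CAMERA / 8   # megabits -> megabytes
tb_needed = mb_per_second * seconds / 1e6       # MB -> TB
print(f"{tb_needed:.1f} TB of raw capacity (before RAID overhead)")
```

Note this is raw capacity only; RAID parity, filesystem overhead and headroom for motion-driven bitrate spikes all push the real figure higher.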
Some hard drive vendors, such as Seagate with its Skyhawk AI products, sell HDDs that they claim are optimised for surveillance workloads. These are SATA drives, with an emphasis placed on MTBF (mean time between failures): 2 million hours in the case of the Seagate product.
Increasingly, surveillance is moving to the cloud, in so-called video surveillance as a service, in which the customer just has cameras on-site plus the ability to connect to the internet.
With so-called vSaaS, one big trade-off is ongoing data storage costs against the one-time outlay for an NVR product. Having said that, cloud costs are likely to be more predictable than spend on on-site storage capacity or on dealing with failed components.
Issues could also arise if internet connectivity is broken and images are not captured. Hybrid systems have been developed that use some on-site storage to counter that eventuality.
A big plus for cloud-based surveillance is that the added intelligence of AI can be added as a service to existing provision.
Read more:
IP surveillance: The storage it needs, on-premise and in the cloud - ComputerWeekly.com
Does AD CS Work in the Cloud? – Security Boulevard
Digital certificates have taken over as the preferred method of network authentication because of their proven superiority to passwords in security and user experience. Many organizations recognize this and want to take their infrastructure to the cloud while also implementing a certificate-based solution. Unfortunately, Microsoft AD environments are having a hard time making the transition due to their attachment to on-premise hardware.
Many admins looking to transition to Azure are unsure how to implement a PKI in the cloud, or whether it's even possible. In this article, we're going to examine whether AD CS is a viable certificate solution in the cloud.
Active Directory Certificate Services (AD CS) is a Windows Server role designed to issue digital certificates. According to Microsoft, AD CS is "the Server Role that allows you to build a public key infrastructure (PKI) and provide public key cryptography, digital certificates, and digital signature capabilities for your organization."
It's important to note that AD CS isn't technically a PKI; it provides a platform on which to build and implement one. Certificates need a PKI to operate; however, admins may want to hold off on building their own PKI with AD CS because it's restrictive and expensive.
AD CS can only be run on-premise, which is not ideal for an industry quickly moving to the cloud. Its restrictive nature prevents admins from choosing their own infrastructure and keeps many environments from migrating their systems to the cloud.
Luckily, there are cloud-based alternatives. SecureW2's Managed Cloud PKI doesn't require extra hardware to set up (since it's all in the cloud), can be set up in less than an hour, and comes with tons of certificate automation features that make issuing and managing certificates significantly easier and more cost-effective.
If you've been considering switching from your Active Directory to Azure AD, SecureW2 is the only vendor that empowers organizations with 802.1X authentication using Azure AD.
With SecureW2, your organization no longer needs to be held back from going to the cloud because you have on-prem AD CS hardware. Our services are easy to use and can allow you to adapt your infrastructure to the cloud in no time.
One of the biggest hurdles with certificates is how IT admins can get a certificate onto every user device, especially now that nearly every employee has multiple devices. By integrating AD CS with SecureW2, admins can automatically configure both BYODs and managed devices for 802.1X settings and equip them with certificates.
AD CS admins can deploy SecureW2's onboarding software to automate certificate enrollment and 802.1X configuration. Our automated services relieve admins of manually configuring every BYOD for a certificate. Plus, end users have a far better experience: all they need to do is press a few buttons and their devices handle the rest.
AD CS admins can also integrate their MDMs, like Microsoft Intune, with SecureW2 to securely distribute certificates to every managed device. Using our Management Portal, admins can set up powerful Certificate Auto-Enrollment Gateway APIs to send out payloads containing 802.1X configuration settings, so every managed network device can self-enroll for a certificate. Check out our page on using SCEP to enroll EAP-TLS certificates with Intune.
Equipping your network with cloud-based 802.1X certificate security is a sure-fire way to ensure your organization's data remains secure. Combining AD CS and SecureW2 is the best way to distribute and manage your certificates. Our Cloud PKI and Cloud RADIUS are cheaper, more versatile and easier to use than any on-premise alternative. Contact us here for more information on how to get started!
The post Does AD CS Work in the Cloud? appeared first on SecureW2.
*** This is a Security Bloggers Network syndicated blog from SecureW2 authored by Eytan Raphaely. Read the original post at: https://www.securew2.com/blog/does-ad-cs-work-in-the-cloud/
Read more:
Does AD CS Work in the Cloud? - Security Boulevard
SKT Unveils its AI Chip and New Plans for AI Semiconductor Business – HPCwire
Nov. 25, 2020 SK Telecom (SKT) today unveiled its self-developed artificial intelligence (AI) chip named SAPEON X220 and shared its AI semiconductor business vision.
SAPEON X220 is optimally designed to process artificial intelligence tasks faster, using less power, by efficiently processing large amounts of data in parallel. Its deep learning computation speed is 6.7 kilo-frames per second, 1.5 times faster than that of the inference Graphics Processing Units (GPUs) widely used by AI-service companies. At the same time, it consumes 60 watts of energy, about 20% less power than a GPU, and costs about half as much.
SKT explained that SAPEON X220 will enable the provision of high-quality AI services by enhancing the performance of AI data centers through speedy computation of massive amounts of data.
From next year, the company will apply SAPEON X220 to its AI service NUGU to improve its voice recognition capability. SAPEON X220 will also be utilized by SKT's affiliate companies. For instance, ADT Caps will apply the chip to enhance the performance of its AI-based video monitoring service, T View.
In addition, SAPEON X220 will be applied to the cloud server of the next-generation media platform of Cast.era, a joint venture of SKT and Sinclair Broadcast Group.
SKT will also provide the AI chip to enable AI-based projects promoted under the Korean New Deal Initiative put forth by the Korean Government.
The company aims to generate synergies by combining AI semiconductor chips and 5G edge cloud. The application of an AI chip to a cloud located close to the end users will enable the provision of high-quality AI services with ultra-low latency regardless of customer device.
Moreover, the company said that it will go beyond just providing the AI chip to data centers by actively promoting the AI as a Service (AIaaS) business. It will offer a complete solution package as a service by combining its AI chip and AI software, including diverse AI algorithms for features like content recommendation, voice recognition, video recognition and media upscaling, along with Application Programming Interfaces (APIs).
For instance, an over-the-top (OTT) service provider wanting to adopt an AI-based content curation service will be able to easily implement the service by simply using SKTs solution that comes with all that is needed, from high-performance data center applied with AI chip to AI-based software like content recommendation algorithm and APIs.
Meanwhile, SKT is also playing a leading role in Korea's efforts to secure differentiated competitiveness in AI semiconductor technology. The industry-academia-research consortium led by SKT is currently carrying out a large-scale project assigned by the Ministry of Science and ICT to develop next-generation AI chips and interfaces that can be utilized for high-performance servers like cloud data centers. The consortium members include SK Hynix, Seoul National University and the Korea Electronics Technology Institute. SKT brings to the consortium its capability in AI processor cores and is working closely with SK Hynix, the world's second-largest memory chipmaker, in the area of memory technology to promote the advancement of AI semiconductor technology.
Originally posted here:
SKT Unveils its AI Chip and New Plans for AI Semiconductor Business - HPCwire
Calculating the Total Cost of Hybrid Cloud – Data Center Knowledge
Managing cloud costs is hard enough when you use a straightforward public cloud architecture. It gets even harder when you move to a hybrid environment that integrates public cloud services with private infrastructure. And don't forget to add data center colocation to the mix.
With that reality in mind, here's a guide to calculating the total cost of ownership (TCO) for hybrid cloud architectures that run inside a colocation data center.
Related: Hybrid Cloud: The Benefits of NOT Going All-In
The most obvious expense associated with a colocated hybrid cloud is the cost of the software that you use to build and manage your cloud.
For many organizations today, that software will come in the form of a platform like AWS Outposts, Azure Stack, or Google Anthos, which make it possible to run public cloud services and management tools on private infrastructure.
Related: Everything You Need to Know About Colocation Pricing
At a basic level, each of these services uses the same pricing structure. The vendors charge based on the number of virtual CPUs (vCPUs) that customers run within their hybrid environment. Outposts is somewhat different in that it is priced based on compute instance types, but this is more or less a proxy for vCPUs.
If you use Anthos and Outposts, you can save some money by paying upfront or committing to a monthly subscription. Azure Stack pricing includes only a pay-as-you-go option.
A less obvious cost associated with hybrid cloud platforms like those described above is the extra fees you'll pay for interactions between your private infrastructure and the public cloud.
Those fees include things like data egress charges that public cloud vendors assess when you move data from their clouds into your own storage media. They generally apply even if you are using an environment based on a platform like Azure Stack or Outposts. API calls to public cloud storage services usually incur a fee, too.
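These interaction fees are worth estimating explicitly before committing. The per-gigabyte and per-request rates below are assumptions for illustration only, not any vendor's published price list:

```python
# Sketch: estimating monthly egress and API-request fees when pulling
# data back from a public cloud. All rates are assumed for illustration.

EGRESS_PER_GB = 0.09            # assumed egress rate, $/GB
PER_10K_GET_REQUESTS = 0.004    # assumed object-read pricing, $/10k calls

gb_retrieved = 500              # data pulled back on-prem per month (assumed)
get_requests = 2_000_000        # object reads by applications (assumed)

monthly = (gb_retrieved * EGRESS_PER_GB
           + get_requests / 10_000 * PER_10K_GET_REQUESTS)
print(f"~${monthly:.2f}/month in egress + request fees")
```

Even at these modest volumes, egress dominates the request charges, which is why data-heavy hybrid workloads are the ones most exposed to this line item.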
On top of this, some hybrid cloud services charge separate fees even for storage that you host yourself. Azure charges fees for managing your disks within a hybrid cloud, for example.
It's easy to overlook the costs of things like data egress and API fees in the public cloud. It may be even easier in hybrid environments, where you might assume these fees are built into the basic cost of the hybrid cloud software that you use. Generally, they're not.
The cost of the servers that you use to host your hybrid cloud is another significant factor in your hybrid cloud TCO.
If you use Azure Stack, Anthos, or most other hybrid cloud platforms (like Eucalyptus), you'll need to supply your own servers. The cost of doing so with Azure Stack is likely to be higher than with other platforms, because Azure Stack works only with certified hardware. That means users may not be able to use servers they already own to build a hybrid cloud based on Azure Stack. It also means they will have fewer purchasing options. Anthos and Eucalyptus aren't subject to these restrictions; they work with any type of modern server.
Hardware costs for AWS Outposts are bundled into the cost of the Outposts platform, because AWS supplies the servers (which is why Outposts costs thousands of dollars per month for each server, whereas the other hybrid cloud platforms charge only dollars per month per vCPU). This makes hardware costs for AWS more straightforward and less variable. They may be high, but at least you know exactly what you're going to pay for hardware before you commit to Outposts.
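The two pricing models can be put side by side with some rough arithmetic. Every figure below is an assumption for illustration only; consult the vendors' current price lists for real numbers:

```python
# Sketch: per-vCPU software licence plus self-supplied hardware,
# versus a bundled per-server price. All figures are assumed.

VCPU_FEE_PER_MONTH = 6.0            # assumed per-vCPU software fee
SERVER_BUNDLE_PER_MONTH = 5_000.0   # assumed all-in per-server fee

vcpus_per_server = 64               # assumed server size
own_server_amortised = 700.0        # assumed monthly cost of owned server

diy = vcpus_per_server * VCPU_FEE_PER_MONTH + own_server_amortised
print(f"per-vCPU + own hardware: ${diy:,.0f}/month")
print(f"bundled per-server:      ${SERVER_BUNDLE_PER_MONTH:,.0f}/month")
```

The point of the exercise is less the specific numbers than the structure: the per-vCPU model shifts hardware cost and risk onto you, while the bundled model trades a higher headline price for predictability.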
When you run a hybrid cloud inside a colocation data center, colocation costs are another key expense that contributes to your TCO.
Calculating these costs can be difficult, because the specifics of colocation pricing vary from provider to provider. You may need to pay for resources like electricity and network service as separate costs, or they may be built into your colocation bundle.
You will also pay your colocation provider for the network links between your infrastructure and your cloud provider's. Those costs alone are almost never a straightforward calculation.
Thus, there's no simple way to determine how much colocation will add to your hybrid cloud TCO. But whatever the details of your colocation plan, the costs are likely to be significant, so you'll want to assess them carefully before committing to a colocated hybrid cloud.
The final factor to consider in hybrid cloud TCO is the cost of deploying, managing, and supporting your hybrid cloud environment.
These expenses will vary depending on which platform you use. They're likely to be lowest in the case of AWS Outposts, which is a fully managed service with minimal deployment or upkeep effort required on the part of customers.
Azure Stack and Anthos leave more up to the user. That said, the fact that these platforms for the most part use the same management tooling as the public clouds with which they are associated means that, if you already know how to use those tools, you wont face a steep learning curve when adjusting to hybrid cloud management.
One advantage of using colocation data centers to host your hybrid cloud is that you may also be able to obtain management and support services for the hybrid environment from the colocation provider. AppScale, which sells a hybrid cloud framework based on Eucalyptus, is partnering with some colo providers around support services, for example. But for now, colo packages that bundle hybrid cloud management with colo space and services are the exception.
When it comes time to determine how much a hybrid cloud will cost, there are a variety of factors to consider. Hybrid cloud software and infrastructure are the most obvious, but it's critical to include the several other types of expense as well when calculating hybrid cloud TCO.
Visit link:
Calculating the Total Cost of Hybrid Cloud - Data Center Knowledge
Cloud Server Market Research Report: Overview With Geographical Segmentation By Revenue With Forecast 2026 – Cheshire Media
The research offers a comprehensive analysis of the Cloud Server Market. Bringing out the complete key insights of the industry, the report aims to provide an insight into the latest trends, current market scenario, and technologies related to the market. In addition, it helps venture capitalists understand the revenue opportunities across different segments to make better decisions. The Global Cloud Server market report covers market analyses before COVID-19 and opportunities after the pandemic. With the COVID-19 pandemic, many industries are transforming rapidly. The Global Cloud Server Market is one of the major industries undergoing change. This year, many industries have vanished entirely from the market and many have risen.
Moreover, government-backed schemes throughout the globe are offering many advantages to businesses. As the governing bodies are supporting the industries, this will be a strong pillar supporting the market growth of Cloud Server over the upcoming decade (2020-2026). Organizations planning to move into new market segments can take the help of market indicators to draw a business plan. With the technological boom, new markets are blossoming across the globe, making it a breeding ground for new businesses.
Request for Sample with Complete TOC and Figures & Graphs @ https://www.in4research.com/sample-request/3178
Global Cloud Server Market 2020: Covering both the industrial and the commercial aspects of the Global Cloud Server Market, the report encircles several crucial chapters that give it an extra edge. The report deep dives into several sections that play a crucial role in forming a holistic view of the market. These include company profiles, industry analysis, a competitive dashboard, comparative analysis of the key players, and regional analysis with further country-wise breakdown.
Global Cloud Server Market Analysis by Key Players:
Moreover, one unique aspect of the report is that it also covers country-level analysis of the regulatory scenario, technology penetration, predictive trends, and prescriptive trends. This gives readers not only real-time insights but also country-wise analysis, which plays a vital role in decision-making. The report is not limited to the above-mentioned key pointers; it also emphasizes market opportunities, Porter's five forces, and analysis of the different types of products and applications in the Global Cloud Server Market.
The report splits by major applications:
The report is also analyzed by type:
Any questions or want to Customization on this report, just speak with analyst @ https://www.in4research.com/speak-to-analyst/3178
The Global Cloud Server Market Report is a professional and in-depth research report on the world's major regional market conditions in the Cloud Server industry, focusing on the main regions and countries, as follows:
COVID-19 Impact on Cloud Server Market:
The outbreak of COVID-19 has brought along a global recession, which has impacted several industries. Alongside this impact, the COVID-19 pandemic has also generated a few new business opportunities for the Cloud Server Market. The overall competitive landscape and market dynamics of Cloud Server have been disrupted by this pandemic. All these disruptions and impacts have been analysed quantifiably in this report, backed by market trends, events and revenue shift analysis. The COVID impact analysis also covers strategic adjustments for Tier 1, 2 and 3 players in the Cloud Server Market.
Get Brief Information on Pre COVID-19 Analysis and Post COVID-19 Opportunities in Cloud Server Market @ https://www.in4research.com/impactC19-request/3178
Table of Contents Includes Major Points as follows:
Read more here:
Cloud Server Market Research Report: Overview With Geographical Segmentation By Revenue With Forecast 2026 - Cheshire Media