A new, long-term support Java release is just around the corner. As Java passes the 25-year mark, let's step back and look at some of the reasons why Java remains the best programming language for modern software development.
Java also has a very rigid and predictable set of rules that govern code structure. This contrasts starkly with loosely typed scripting languages, where it can seem like anything goes. When you try to master a new programming language, a clear set of consistently enforced rules makes learning easier.
Furthermore, when things don't make sense, programmers new to Java have a strong support network: YouTube videos, websites like Stack Overflow and online forums like CodeRanch provide answers to a multitude of questions.
The Java API is extensive. The standard JDK comes with over 200 built-in packages containing Java APIs that allow for everything from parsing XML to translating between time zones. When developers add on the Jakarta EE APIs, they have an even more extensive library of APIs that allow for the development of complex, middle-tier applications and cloud-native microservices.
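The time-zone translation mentioned above, for instance, can be done entirely with the standard `java.time` package that ships in the JDK. A minimal sketch (the zone IDs and the `ZoneDemo` class name are just illustrative):

```java
import java.time.ZonedDateTime;
import java.time.ZoneId;

public class ZoneDemo {
    // Translate a zoned timestamp to another time zone, preserving the instant
    static ZonedDateTime toZone(ZonedDateTime source, String zoneId) {
        return source.withZoneSameInstant(ZoneId.of(zoneId));
    }

    public static void main(String[] args) {
        ZonedDateTime newYork = ZonedDateTime.now(ZoneId.of("America/New_York"));
        ZonedDateTime tokyo = toZone(newYork, "Asia/Tokyo");
        // Same instant in time, different wall-clock representation
        System.out.println(newYork + " -> " + tokyo);
    }
}
```

No third-party library is needed; `withZoneSameInstant` keeps the underlying moment fixed while changing only the zone's local representation.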
But the rich ecosystem of Java APIs extends far beyond those sanctioned by Oracle or provisioned through Jakarta.
For data persistence, there's the JBoss Hibernate project. For cloud-native microservices development in Java, there's the full suite of Spring Boot APIs. And of course there's a multitude of open source Apache API projects that address a wide range of software development use cases, from aggregating error messages with log4j to solving complicated problems with HashMaps and fail-safe Iterators through the Apache Commons Collections API.
The rich set of Java APIs available to developers aids in the development of robust, error-free applications.
The application development landscape is filled with software development tools written in Java -- by Java developers -- that are designed to simplify and streamline the development, deployment and even decommissioning of Java applications. This is by no means an exhaustive description of Java's tool ecosystem: examples of tools and technologies written in Java range from application servers like Tomcat to Red Hat's popular, Kubernetes-native Java stack, Quarkus.
Few other programming languages enjoy the same level of tooling support that Java does, which helps cement its status as the best programming language.
Android is the world's most popular mobile phone OS and Java is the de facto programming language for Android application development.
While Android's version of Java isn't exactly the same as what you'd find on the JDK, Google did copy over 11,500 lines of code from the Java Standard Edition when they built their Java clone. As a result, developers can expect that the version of Java they see on Android is pretty close to the original.
If you can write Java code for desktop or server-side applications, you'll pick up Android development very quickly. The low-level differences between the JVM and the Android Runtime (ART) are all pleasantly abstracted away after a brief learning curve. When developers learn Java, the entire Android ecosystem is at their disposal.
Java evolves slowly, but it does evolve. With over 25 years of evolution, Java has plenty of evolutionary improvements to be proud of.
From the bulletproof modularity system delivered as part of Project Jigsaw to the more recently added ability to write functional-style code with lambda expressions, Java continues to implement the big changes the community demands.
Incremental additions in non-LTS releases, such as the addition of the new record data type and new garbage collectors for improved memory management, show that the JDK is also constantly evolving.
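As a brief illustration of the two features just mentioned, a record declaration combined with a lambda-driven stream pipeline looks like this (the `Point` type and values are invented for the example):

```java
import java.util.List;

public class RecordDemo {
    // Records (standardized in Java 16) generate the constructor, accessors,
    // equals, hashCode and toString from a single declaration
    record Point(int x, int y) {}

    public static void main(String[] args) {
        List<Point> points = List.of(new Point(1, 2), new Point(3, 4));
        // Lambdas and method references enable a functional style over collections
        int sumOfX = points.stream().mapToInt(Point::x).sum();
        System.out.println(sumOfX); // prints 4
    }
}
```

The record removes the boilerplate a plain data class would need, and the stream pipeline replaces an explicit accumulator loop.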
But in the enterprise software development world, backwards compatibility is just as important as adding new features, and it has always been a top priority for the stewards of the language. Very rarely does a comprehensive update or feature addition create issues with code written against older releases.
I personally compiled and packaged some complicated code that was written 20 years ago, and it ran without any issues on the latest Java 17 release. When enterprise clients know that things won't break when they perform a JVM upgrade, it gives them one more reason to stick with Java as the best programming language.
View original post here:
5 reasons why Java is still the best programming language - TheServerSide.com
Datto Continuity for Microsoft Azure protects MSPs and their clients’ data in the public cloud – Help Net Security
Datto announced the commercial availability of Datto Continuity for Microsoft Azure, a comprehensive Business Continuity and Disaster Recovery (BCDR) solution that protects MSPs and their clients' data in the public cloud in the event of malicious ransomware attacks, security breaches, and vendor outages.
Datto Continuity for Microsoft Azure is architected to leverage the secure Datto Cloud to address a critical and unmet need for MSPs: the added protection and recovery of data in the public cloud through multi-cloud replication. This comprehensive data protection, management, and streamlined recovery is delivered at a predictable cost and without the need for MSPs to piece together individual technologies or depend solely on Microsoft's data backup services.
An increasing number of Small and Medium Businesses (SMBs) are opting to host their infrastructure and applications in the public cloud, with worldwide end-user spending on public cloud services forecasted to grow 23.1% in 2021 to total $332.3 billion, up from $270 billion in 2020, according to a forecast from Gartner, Inc. This has been a key growth area for MSPs as they support their clients and shift to Infrastructure-as-a-Service (IaaS) models.
Microsoft Azure has proven to be the leading public cloud provider for businesses in the beginning-to-middle stages of public cloud adoption, such as MSPs. Azure provides reliable cloud services for building, deploying, and managing intelligent applications. While Microsoft is responsible for the security of the physical hosts, network, and data centers, its Shared Responsibility Model states that the customer is responsible for securing and protecting all applications, data, and endpoints contained within Azure. In this model, MSPs carry the responsibility of ensuring they have reliable business continuity solutions in place to protect their clients' workloads across on-premises servers and the public cloud.
Datto Continuity for Microsoft Azure performs more frequent backups than other solutions and reduces the risk of data loss or prolonged system downtime due to ransomware and public cloud outages. Further, MSPs can now eliminate single-cloud risk by having the ability to restore data from either Microsoft Azure or the immutable, private Datto Cloud as secondary system protection. With this solution, MSPs can offer their clients the scalability and efficiency of the public cloud while feeling confident their clients' cloud data is backed up and can be recovered within minutes, even if there is an Azure outage.
Purpose-built to help MSPs scale and grow their business, Datto Continuity for Microsoft Azure delivers a predictable pricing model. One easy-to-understand, flat-fee bill accounts for all associated BCDR costs, all achieved with zero egress costs for MSPs. This solution creates a durable margin opportunity, eliminates the need to estimate public cloud service costs or handle unexpected charges, and reduces the total cost of ownership for MSPs. In addition, Datto Continuity for Microsoft Azure is cloud-native to enable MSP partners to quickly scale endpoints, while providing high levels of security and reliability.
MSPs have direct access to Datto's award-winning 24/7/365 support team and will no longer need to manage multiple vendors or integrate third-party technologies. The flexible turnkey solution requires minimal training and enables MSPs to manage both on-premises and cloud backup workloads from a single multi-tenant dashboard.
"The ongoing cyberwar against ransomware underscores the importance of closing the gap in security for MSPs utilizing the public cloud," said Radhesh Menon, Chief Product Officer at Datto. "We architected Datto Continuity for Microsoft Azure from the ground up with the key design goals of providing comprehensive protection and rapid recovery, and an extra layer of protection through multi-cloud replication to Datto's private cloud. In addition, Datto is able to deliver predictability on margin to bring confidence to MSPs that their time and investments in hybrid cloud protection are both secure and profitable."
"The driving force behind our clients' desire to migrate to the public cloud is two- or threefold," said Rick Topping, VP of Operations at Ceeva, Inc., an MSP in Pennsylvania and beta customer of Datto Continuity for Microsoft Azure. "Companies understand their responsibility to ensure data is available to employees at all times and realize it needs to be kept as safe and secure as possible in today's environment. Our clients sometimes find it difficult to host an on-premises system and receive the same features and functionality available with a cloud-based system. Datto Continuity for Microsoft Azure provides those capabilities. As we help our clients on their journey to the cloud, this solution gives us the same data protection and business continuity that we're used to with Datto SIRIS and provides the reliability and trust known with Datto."
Datto Continuity for Microsoft Azure is available immediately in the United States, Canada, UK, Ireland, Australia, and New Zealand through an Early Access Program that includes introductory pricing until December. Global availability with additional integrations and feature enhancements is expected by the end of 2021.
László Marton, chief operating officer of Invitech
The growing need for cost-efficiency, sustainability, and resilience, as well as the increasing number of shared services and outsourcing, all fuel the adoption of cloud-based business solutions in Hungary. Although Hungary is still below the EU average in cloud computing, its use is expanding in almost all business sectors as locals increasingly have good personal experiences to draw on, László Marton, chief operating officer of ICT provider Invitech, tells the Budapest Business Journal.
BBJ: Where is the Hungarian market in the journey between on-site and cloud-based solutions?
László Marton: Market players in Hungary have generated a considerable improvement recently and developed a basic, initial knowledge of cloud services. Undoubtedly, every business procedure has specific fields where cloud frameworks can deliver higher sustainability, resilience, and flexibility. I am positive that almost all large- and medium-sized firms have engaged in some way with the cloud, if only through their mailing system. Of course, many are going further in adopting a wide range of cloud-based business solutions. Others have no cloud experience at all. I would not call the latter laggards or conservatives; their technological microenvironment and business goals are simply not in line with the benefits of cloud solutions. If we think about the Hungarian assembly unit of a foreign auto parts manufacturer, we can all admit that they are not in urgent need of cloud-based services. E-commerce firms in the retail sector, on the other hand, have already identified its importance, especially in running their online webshops; besides the everyday workflow, cloud computing remarkably facilitates planning for shipping capacities as well. Cloud penetration, in general, is on the right track in Hungary. This is a shared journey where clients, as well as service providers, go side-by-side.
BBJ: What are the main driving factors for migration?
ML: There is an array of different types of capabilities the cloud can deliver for businesses. I would mention improved agility, resilience, and flexibility. When it comes to the latter, we need to spotlight business-related driving factors next to more common technological ones. The cloud, for instance, helps business environments meet market needs more effectively. Businesses can respond faster to potential expansion, and it also assists firms in faster market entry. In addition, it is easier to change or open new sales channels in cloud frameworks. Let's also mention here data and cyber security. Migrating either a partial or entire backup environment to the cloud gives extra safety. We have learned from the pandemic that providing remote workers with access to usual capacities is indispensable, which is much easier via the cloud than via on-premises infrastructure. Just as in the European Union or the United States, business process modeling is becoming increasingly popular in Hungary. And modeling requires capacities. Among the most effective driving factors, however, are AI and self-learning systems being accessible in the cloud. The value of a company primarily depends on its product and service portfolio, but sustainable workflows and environmental awareness also drive business value. According to a study compiled by consulting firm Accenture, migrating to the cloud through consolidated IT environments cuts CO2 emissions by at least 80%.
BBJ: How do you differentiate yourself against competitors?
ML: Invitech tries to position itself and its entire infrastructure as a professional mentor and a consultancy firm helping its clients in their migration. Invitech can combine its products and services with those of giants such as Amazon or Microsoft. We strive for this hybrid approach in our strategy.
BBJ: Is there any mistrust among customers in terms of cyber security and data protection?
ML: Despite ISO standards and countless guarantees from service providers, company data assets, client information, and know-how stored outside of the companies' premises naturally generate lingering concerns on the market. Some companies in various areas will never migrate to a cloud server. Many other business processes, however, are a perfect fit for the cloud with essential security certifications. Large international public service providers also face some mistrust from the fact that customer data is usually stored in servers beyond the customers' homeland, often on the other side of the world. Invitech data centers are located in Hungary, eliminating exposure to international privacy risks.
BBJ: Is education still necessary?
ML: Yes, absolutely. Invitech's strategy leans heavily on education, which is not necessarily technical but much more business-oriented. We must educate our clients in various fields, explaining that migrating business workflows will bring added value to their operation. The technical advantages, in general, have been acknowledged by customers. Education is more about business. Why? Because it is tough to define relevant metrics and KPIs around results. We do not have existing solutions to measure the benefits of migrating, for example, invoicing and product development systems into the cloud. It is very hard to create standardized measuring systems, as different customers use different KPIs.
BBJ: How do you see future trends?
ML: I think that the future will bring solid development in Software as a Service (SaaS), especially for functionality-based business models. When customers want to develop a CRM system, they still like to contact software manufacturers instead of defining business functionality goals and inviting various service providers in a tender. I think that the future will help migrate relevant functionalities to the cloud too. Another subsequent aspect is that customers will reconsider existing implementations, start thinking about how they can make things better, and identify areas where development is necessary. At the same time, I project that large-scale public service providers will extend further on the market in partnership with local firms as, no doubt, cloud computing remains a good business.
This article was first published in the Budapest Business Journal print issue of September 10, 2021.
MOUNTAIN VIEW, Calif.--(BUSINESS WIRE)--SentinelOne (NYSE: S), an autonomous cybersecurity platform company, today announced its participation as a launch collaborator for Amazon Elastic Kubernetes Service (EKS) Anywhere. Amazon EKS Anywhere is a new deployment option for Amazon EKS that enables easy creation and operation of Kubernetes clusters on-premises, including on virtual machines (VMs) and bare metal servers. The SentinelOne Singularity Cloud extends security and visibility to assets running in public clouds, private clouds, and data centers via a single console, delivering a fully managed security solution for containerized environments.
"Amazon EKS Anywhere brings unprecedented flexibility and agility for Kubernetes workloads by offering true hybrid cloud container orchestration," said Guy Gertner, Vice President of Product Management, SentinelOne. "The SentinelOne Singularity Platform delivers protection and EDR to Kubernetes and containerized workloads wherever they are deployed, whether on premises or on AWS."
While Kubernetes is popular in the DevOps community, if improperly secured, it presents an attractive target for adversaries. According to IDC [1], 98% of companies have experienced a cloud data breach in the last 18 months, with Kubernetes becoming an increasingly popular attack vector for threat actors interested in data theft, cryptomining, and denial of service attacks. SentinelOne mitigates this risk by delivering a cloud security solution with the same level of management and automation as Amazon EKS Anywhere delivers for Kubernetes.
"Flexibility and choice are paramount to why we selected AWS. Using SentinelOne to secure AWS, we can prevent incidents in seconds, stopping attacks in their tracks before they can progress," said Jason Spencer, Vice President of Global IT at R.R. Donnelley.
"Containers are one of the fastest growing ways for companies to deploy and manage applications. Customers run their containers on AWS because of its secure, reliable infrastructure and wide breadth of performant services for containers including Amazon EKS, Amazon ECS, and Amazon Fargate," said Bob Wise, General Manager for Kubernetes at AWS. "Regardless of the service used, security is our top priority at AWS. As AWS grows services like EKS and ECS Anywhere, AWS Outposts, and AWS Wavelength, our customers are using AWS to run containerized applications across the cloud, data centers, and at the edge. We're excited by this collaboration with SentinelOne to provide our customers with an additional layer of consistent workload protection across container services and environments."
Singularity Cloud is a single console for multi-cloud management. It enables security teams to manage both Linux and Windows servers in Amazon Elastic Compute Cloud (EC2) and Docker or Kubernetes containers from the same console in which they secure enterprise attack surfaces. SentinelOne agents deliver AI-driven runtime protection, detection, and response at machine speed across an entire hybrid cloud estate, from Docker containers to self-managed and managed Kubernetes services like Amazon EKS, Amazon EKS Anywhere, Amazon Elastic Container Service (ECS), and Amazon ECS Anywhere.
The SentinelOne agent is DevOps-friendly: auto-deployed as a single, resource-efficient Kubernetes agent that protects the Kubernetes worker, its pods, and containers without impacting performance or introducing complexity.
SentinelOne is available in AWS Marketplace. For more information on the SentinelOne Singularity Marketplace, visit http://www.sentinelone.com/partners/singularity-marketplace.
SentinelOne's cybersecurity solution encompasses AI-powered prevention, detection, response and hunting across endpoints, containers, cloud workloads, and IoT devices in a single autonomous XDR platform.
[1] State of Cloud Security 2021, an Ermetic report based on a funded research study by IDC
Read more from the original source:
SentinelOne Secures Amazon EKS Anywhere with SentinelOne Singularity - Business Wire
IBM is introducing the new IBM Power E1080 server, the first in a new family of servers based on the new IBM Power10 processor, designed specifically for hybrid cloud environments.
The IBM Power10-equipped E1080 server is engineered to be one of the most secure server platforms and is designed to help clients operate a secure, frictionless hybrid cloud experience across their entire IT infrastructure, according to the vendor.
"When we were designing the E1080, we had to be cognizant of how the pandemic was changing not only consumer behavior but also our customers' behavior and needs from their IT infrastructure," said Dylan Boday, VP of product management for AI and hybrid cloud. "The E1080 is IBM's first system designed from the silicon up for hybrid cloud environments, a system tailor-built to serve as the foundation for our vision of a dynamic and secure, frictionless hybrid cloud experience."
The new IBM E1080 was designed to introduce several key features.
IBM is also introducing Power Expert Care, which offers a tiered approach to service including Advanced and Premium Expert Care tiers.
The simple service tiers and pricing facilitate straightforward support options for the IBM Power E1080 server. Additional add-ons, such as hardware and software system health checks and regular security updates, are designed to ensure that systems stay protected against the latest cybersecurity threats, while also providing software and hardware coherence and higher system availability.
For more information about this news, visit http://www.ibm.com.
For several years, Google has allowed users to automatically back up pictures and videos from their phone to the cloud using Google Photos' Back up & sync functionality. Although the backup process runs well on its own, users were limited to deleting all the backed-up content manually. This meant that you had to open the Google Photos app and use its Free up device storage option every time you wanted to delete files that you had already backed up.
The company is now providing a new way to free up storage on your phone by letting you delete your backed-up media automatically after a certain period. This comes in the form of an update to the Files by Google app, which now has a new Smart Storage option. In this post, we'll explain what Smart Storage is all about and how you can use it to delete backed-up media from your Android smartphone.
Google has released a new Smart Storage feature for its Files by Google app on Android that allows you to permanently delete media from your phone that has already been backed up to Google Photos. Smart Storage can thus help you clear your phone's local storage by deleting media that has already been uploaded to Google Photos' cloud servers.
When you turn on Smart Storage, the app will delete all the pictures and videos that have been backed up to Google Photos but are still available on your device locally for more than 60 days after the backup was complete. Because of the 60-day limit, your media doesn't get deleted as soon as you've backed it up but stays on your phone for up to 60 days in original quality before getting deleted.
This is important as many users back up their pictures and videos in reduced quality to upload as much stuff as possible but may need the backed-up media in original quality on their phones for some time so they can share it with others or on social media.
The automatic deletion will also occur if your device's free storage is less than 25% of its total capacity, according to this Google Help page. This means your backed-up media may get deleted even if it has been saved on your phone for less than 60 days, if your phone's storage is running low.
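Pulling the two conditions together, the deletion rule described above can be sketched roughly as follows. This is an illustrative approximation of the behavior the article describes, not actual Files by Google code; the class, method names, and exact comparison boundaries are assumptions, while the 60-day and 25% thresholds come from the article:

```java
public class SmartStorageRule {
    static final int RETENTION_DAYS = 60;         // per the article
    static final double LOW_STORAGE_RATIO = 0.25; // free-space threshold

    // A backed-up file becomes eligible for local deletion when it has been
    // backed up for more than 60 days, OR when free storage drops below 25%
    static boolean eligibleForDeletion(long daysSinceBackup,
                                       long freeBytes, long totalBytes) {
        boolean pastRetention = daysSinceBackup > RETENTION_DAYS;
        boolean lowStorage = (double) freeBytes / totalBytes < LOW_STORAGE_RATIO;
        return pastRetention || lowStorage;
    }

    public static void main(String[] args) {
        System.out.println(eligibleForDeletion(90, 50, 100)); // true: past 60 days
        System.out.println(eligibleForDeletion(10, 20, 100)); // true: low storage
        System.out.println(eligibleForDeletion(10, 50, 100)); // false: neither
    }
}
```

The key point is that the two triggers are independent: either one alone is enough for the app to reclaim the local copy.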
Once your backed-up media has been deleted from the local storage of your phone, you will still be able to access your pictures and videos inside the Google Photos app or on the web, but they will only be available in your chosen backup quality.
Smart Storage is a new feature that isn't natively present on Android but is instead only available in the Files by Google app. So, before you proceed to do anything, make sure you have the latest version of Files by Google from the Google Play Store, first and foremost. Once that's out of the way, you still need to ensure you have the following things:
Once you've made sure you have all that you need to use the new Smart Storage feature, you can start using it on your Pixel device to free up some storage on your phone. To get started, open the Files by Google app on Android and tap the Hamburger menu button (the one marked by three horizontal lines) at the top left corner.
In the sidebar that appears on the left, select the Settings option.
On the next screen, enable the Smart Storage feature by tapping the toggle adjacent to it under Hidden suggestions.
A prompt will appear asking you whether you wish your backed-up media to be deleted automatically. Here, tap on OK.
The Smart Storage feature will now be enabled and you should now see a notification at the bottom confirming this.
The Files by Google app will now delete your backed-up media on its own when your pictures and videos are at least 60 days old or if your phone's free storage is under 25% of its total capacity.
That's all you need to know about automatically deleting backed-up media on Android.
Workplace safety can come down to milliseconds. Whether enforcing rules for hard hats or masks, a system responsible for protecting a site needs to issue a noticeable alert almost simultaneously when it detects non-compliance.
Lumen Technologies and IBM developed a solution that meets the very low latency requirements of such use cases by ingesting and analyzing all data where, and as soon as, it is generated. The solution uses video cameras to send images in real time to a video management server, on which IBM Video Analytics software quickly processes each image, triggering an alert if needed. Were the system to operate more slowly, a person at risk could already be many steps into the restricted area before being stopped.
Managing video analytics at the edge
Lumen Technologies and IBM built a safety system with a set of three video cameras and two servers. The cameras are linked to one of the servers, the video management server, which runs the analytics software. This software receives and processes video images, identifies violations of movement rules, and triggers alerts. In production, the number of video management servers increases in proportion to the number of cameras, depending on the ratio that preserves low-latency performance.
Scaling up while rapidly iterating
On a separate server on-site, IBM Edge Application Manager runs in containers on Red Hat OpenShift for IBM Cloud; its role is to install the most recent version of the analytics software on all video management servers there. As the number of video management servers in a deployment increases, so would the containerized instances of the Edge Application Manager and the number of OpenShift worker nodes needed to support them.
Consistent deployments across locations
But what is a deployment of Red Hat OpenShift for IBM Cloud doing in an edge site?
As you have already guessed, the short answer is IBM Cloud Satellite.
In setting up the solution with the video cameras and video management servers in place, a customer's operations team first uses IBM Cloud to select hosts at the edge site to serve as the Satellite location. Once the location is set up, the team uses the same IBM Cloud console to provision Red Hat OpenShift for IBM Cloud in that new location and deploy the Edge Application Manager in containers to pods on virtual machines serving as worker nodes.
And this is the key to scalability for this safety solution. Besides putting video management servers in place and linking video cameras to them, rolling out their solution in new sites is easily accomplished by setting up Satellite locations, provisioning Red Hat OpenShift for IBM Cloud, and deploying the appropriate number of Edge Application Manager instances to worker nodes.
The consistency of software across all locations is ensured through the single view in IBM Cloud, from which cloud services, containerized applications, security, and network policies are monitored and can be managed across public and private environments.
Adapting to emerging needs
Since the video analytics software can be trained to identify any visual pattern and enforce different movement rules related to what is observed, the safety solution is adaptable. For example, with a thermal camera for COVID-19 monitoring, retrained video analytics can allow employers to instantaneously detect employees' temperatures. For that same use case, other camera analytics can calculate how many people are using a space and determine when the next deep cleaning is needed.
Continuous security and observability
A single, consolidated view of Satellite locations shows deployments and services running in every location. Teams can manage the network traffic and configure the applications within all locations and provision and use services as if they are working in the public cloud. That also means client teams can even deploy the same application stack to any location from the IBM Cloud catalog.
Satellite Link establishes secure tunnels and enables control of application and service traffic to and from each location. Satellite Link works with your existing network configuration and security postures. Teams in all Satellite locations use the same access and identity management (IAM). With support for a customers own keys and certificates, consistent data encryption enables workloads to span locations securely. Endpoints across the secure tunnels are uniquely and automatically named, yielding fast DNS, predictable operations, and easy compliance audits.
Consistent and portable operations at any scale
Lumen Technologies and IBM built a solution that can perform real-time, intelligent data analysis at thousands of edge sites across high-speed fiber connections to the many Lumen Edge platform locations where IBM Cloud Satellite and the Edge Application Manager run. Through a single view in IBM Cloud Satellite, operating the solution is consistent across all hubs and locations. That repeatability is a baseline from which teams can gain velocity in rolling out deployments, quickly scaling up edge locations with new functionality, and remotely automating many operational chores.
The original article by Briana Frank, director of product management at IBM, is here.
The views and opinions expressed in this article are those of the author and do not necessarily reflect those of CDOTrends. Image credit: iStockphoto/metamorworks
The global video analytics market size is expected to grow at a Compound Annual Growth Rate (CAGR) of 20.4% during the forecast period, to reach USD…
Key factors that are expected to drive the growth of the market include: increasing investments and focus of governing institutions on public safety; the need to utilize and examine unstructured video surveillance data in real time; a significant drop in crime rate due to surveillance cameras; the growing need among enterprises to leverage BI and actionable insights for advanced operations; the limitations of manual video analysis; government initiatives in adopting emerging technologies to enhance public safety infrastructure; the reduced cost of video surveillance equipment; long-term RoI; and demand for enhanced video surveillance.
New York, Sept. 15, 2021 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Video Analytics Market with COVID-19 Impact, by Component, Application, Deployment Model, Type, Vertical And Region - Global Forecast to 2026" - https://www.reportlinker.com/p04838914/?utm_source=GNW
The COVID-19 impact on the global video analytics market
The recent economic slowdown and the impact of COVID-19 emphasize the need for alternate business systems. It has become important for businesses to embrace cloud computing and migrate to cloud video analytics solutions.
This will help organizations maintain a stable business condition in the short term while targeting continued growth and expansion in the long run. The recent COVID-19 crisis has shifted the focus to the safety and security of human lives.
Another driver is the emergence of intuitive technologies, mainly AI-based surveillance systems built on deep learning and computer vision. Organizations are adopting video analytics solutions across end-user industries for a variety of benefits, including dynamically attaining situational awareness, proactively driving real-time alerting, and scheduling BI dashboards.
Edge-based segment to grow at a higher CAGR during the forecast period
Based on type, the market is segmented into two categories: edge-based and server-based video analytics. The edge-based segment is expected to grow at a higher CAGR during the forecast period.
Edge-based video analytics is evolving with the emergence of powerful built-in chipsets in cameras that offer higher computational capabilities at the edge. Such systems inform operators of a wide range of real-time video or audio events requiring attention and provide more sophisticated analytics, such as queue management and heat maps, that offer new opportunities for business and traffic intelligence.
Advancements in deep learning and its integration with edge systems are expected to drive adoption in the coming years. Deep learning takes ML to another level, based on neural network principles that imitate the complexity of the human brain.
Earlier, this functionality was mainly available in server-side processes, which required videos to be decompressed and processed centrally. Edge-based devices need external inputs to learn from before they can reliably recognize known objects and behaviors.
On-premises segment to account for a higher market share during the forecast period
The video analytics market is segmented by deployment type into on-premises and cloud segments. The on-premises segment accounts for a higher share of the video analytics market during the forecast period.
This approach is mostly adopted for applications that involve the processing of sensitive and confidential data volumes. These data volumes include internal and external surveillance footage and video feeds of business operations that contain confidential information and crucial insights.
In an on-premises deployment, companies have to install the required hardware and software, such as operating systems, storage devices, servers, cameras, routers, and the video analytics software itself. Several large organizations are deploying on-premises video analytics due to privacy and security concerns related to confidential data.
Transportation and logistics vertical to grow at a higher CAGR during the forecast period
Transport and logistics is one of the fastest-growing verticals during the forecast period. Video surveillance has become an important part of the transportation and logistics vertical. The benefits of video analytics for this vertical include the elimination of overcrowding, behavior analysis, enhanced safety measures, incident recording, and the detection of blind spots. Video analytics can contribute to the enhancement of this vertical for commuters while providing improved safety benefits. Features such as facial recognition, object tracking, unidentified object detection, cargo and train carriage recognition, and intelligent traffic monitoring can help transportation and logistics companies prevent disasters and detect emerging threats that may lead to infrastructure destruction or vehicle crashes resulting in the loss of life.
North America to account for the highest market share during the forecast period
The video analytics market is segmented into five regions: North America, Europe, APAC, MEA, and Latin America. The video analytics report provides insights into these regional markets in terms of market size, growth rates, future trends, market drivers, and COVID-19 impact.
North America is expected to hold the highest market share in the overall video analytics market during the forecast period. The region tops the world in terms of the presence of security vendors and the occurrence of security breaches.
Therefore, the global video analytics market is dominated by North America, the most advanced region with regard to technological adoption and infrastructure. Growing concerns about the protection of critical infrastructure and national borders have increased government intervention in recent years.
Specific budget allocations, such as the budget for the Department of Homeland Security, and mandated security policies are expected to make North America the most lucrative market for vendors from various verticals. The North American market covers the analysis of the US and Canada. The protection of critical infrastructure is the most serious economic and national security challenge for the governments of both countries. Many government and law enforcement agencies in the US and Canada are taking initiatives to strengthen their security infrastructure. The US and Canadian governments are continuously working with law enforcement agencies to prevent violent extremism and counter terrorism-related incidents.
The break-up of the profiles of primary participants in the global video analytics market is as follows: By Company: Tier 1: 20%, Tier 2: 25%, and Tier 3: 55%. By Designation: C-Level Executives: 40%, Director Level: 33%, and Others: 27%. By Region: North America: 32%, Europe: 38%, APAC: 18%, and RoW: 12%. The video analytics market comprises major providers, such as Avigilon (Canada), Axis Communications (Sweden), Cisco (US), Honeywell (US), Agent Vi (US), Allgovision (India), Aventura Systems (US), Genetec (Canada), Intellivision (US), Intuvision (US), Puretech Systems (US), Hikvision (China), Dahua (China), Iomniscient (Australia), Huawei (China), Gorilla Technology (Taiwan), Intelligent Security Systems (US), Verint (US), Viseum (UK), Briefcam (US), Bosch Security (Germany), i2V (India), Digital Barrier (UK), Senstar (Canada), Qognify (US), Identiv (US), Ipsotek (US), Delopt (India), Drishti Technologies (US), Natix (Germany), DeepNorth (US), Cronj (India), Microtraffic (Canada), Actuate (US), Calipsa (UK), Athena Security (US), Corsight AI (Israel), Arcules (US), Cawamo (Israel), Kogniz (US), and Durac (US). The study includes an in-depth competitive analysis of key players in the video analytics market with their company profiles, recent developments, COVID-19 developments, and key market strategies.
Research Coverage
The report segments the global video analytics market by component into two categories: software and services. By deployment model, the market is segmented into on-premises and cloud.
By application, the market is segmented into seven categories: incident detection, intrusion management, people/crowd counting, traffic monitoring, automatic number plate recognition, facial recognition, and others. By type, the market is segmented into two categories: server-based and edge-based.
By vertical, the video analytics market has been classified into banking and financial services, city surveillance, critical infrastructure, education, hospitality and entertainment, manufacturing, defense and border security, retail, traffic management, transport and logistics, and others. By region, the market has been segmented into North America, Europe, APAC, MEA, and Latin America.
Key benefits of the report
The report would help the market leaders/new entrants in this market with information on the closest approximations of the revenue numbers for the overall video analytics market and its subsegments. This report would help stakeholders understand the competitive landscape and gain insights to better position their businesses and plan suitable go-to-market strategies.
The report would also help stakeholders understand the pulse of the market and provide information on the key market drivers, restraints, challenges, opportunities, and COVID-19 impact. Read the full report: https://www.reportlinker.com/p04838914/?utm_source=GNW
About Reportlinker
ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.
Nasuni has invented Global File Acceleration (GFA), a way to sync shared files up to 5x faster, and has enabled faster ransomware recovery as a by-product, with a demo restoring a million files in under 40 seconds.
The company provides shared access from edge appliances to files stored in the public cloud, and file changes created by one user on an edge appliance are synced to other users. Nasuni's UniFS filesystem stores files, their data and metadata, in public cloud object storage. Continuous Versioning Technology (CVT) sends changed file data fragments (snapshots) to immutable object storage in the cloud. Lost, deleted or corrupted files can be recovered to any point in time up to the last fragment stored, using CVT metadata and data. GFA speeds this process.
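Nasuni does not publish CVT's internals, but the general pattern it describes is an append-only version store: every change becomes an immutable, timestamped fragment, and a restore simply picks the newest fragment at or before the chosen point in time. A minimal sketch of that idea (class and method names are illustrative, not Nasuni's API):

```python
import bisect
from collections import defaultdict


class VersionStore:
    """Toy model of continuous versioning: each change to a file is
    appended as an immutable (timestamp, data) snapshot. Nothing is
    ever overwritten, so any point in time can be reconstructed."""

    def __init__(self):
        # path -> list of (timestamp, data), kept sorted by timestamp
        self._versions = defaultdict(list)

    def put(self, path, ts, data):
        # Append-only: fragments are immutable once stored.
        self._versions[path].append((ts, data))
        self._versions[path].sort(key=lambda v: v[0])

    def restore(self, path, as_of):
        """Return the newest version of `path` at or before `as_of`,
        or None if the file did not exist yet at that time."""
        versions = self._versions.get(path, [])
        timestamps = [ts for ts, _ in versions]
        i = bisect.bisect_right(timestamps, as_of)
        return versions[i - 1][1] if i else None
```

Because post-attack writes are just later fragments, a ransomware-encrypted file does not destroy the earlier clean versions: `restore("plan.docx", t)` with `t` set to a moment before the attack returns the last good snapshot.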
Nasuni's Chief Product Officer, Russ Kennedy, offered a statement: "No other storage or backup vendor can provide Rapid Ransomware Recovery for file servers the way Nasuni can. And now Nasuni's high-performance Global File Acceleration service sets us even further apart. Enterprises can solve their file protection, primary file storage and multi-site file sharing challenges all in one solution."
We are told by Nasuni that GFA dynamically performs near-real-time, intelligent analysis of file usage to orchestrate and prioritise data propagation of new files across Nasuni Edge Appliances in all locations. As a result, global users sharing files gain the fastest access to the new data they need most.
How GFA does this is not revealed. Think of GFA as a way of souping up data movement speed when changed file data, created using CVT, is synced out to edge appliances. Stephen Held, VP and CIO at Nasuni customer LEO A DALY, said: "It was already simple to manage and collaborate on our global file shares across 27 locations, and file synchronisation has always been much faster than traditional methods. But with this latest release, the performance is dramatically faster."
The same basic process is used in a ransomware recovery. Say a user at an edge appliance creates a new file. That has to be sucked up to the cloud store by Nasuni and then blasted out to other users at the various edge appliances. This involves sending out the updated file:folder metadata and the data in the file when it is accessed.
Nasuni's UniFS software detects that a file has been created and takes care of this. It responds similarly to file deletions. We could imagine a ransomware attack as the equivalent of a mass file deletion: the files are unobtainable. So UniFS restores them to a point in time up to a minute before the attack. It's a kind of mass sync exercise, and GFA speeds it up.
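Treating recovery as a mass sync can be sketched at the namespace level: walk every file's version history and rebuild the whole tree as it stood just before the attack. The encrypted copies written by the ransomware are merely later versions, which a point-in-time rebuild skips. This is a conceptual illustration under assumed data shapes, not Nasuni's actual implementation:

```python
def restore_namespace(histories, as_of):
    """Rebuild an entire file namespace as of time `as_of`.

    `histories` maps path -> list of (timestamp, data) versions,
    where data is None for a deletion. Ransomware encryption just
    adds bad later versions, which the point-in-time walk ignores.
    """
    restored = {}
    for path, versions in histories.items():
        current = None
        # Replay versions in time order, stopping at the cutoff.
        for ts, data in sorted(versions, key=lambda v: v[0]):
            if ts <= as_of:
                current = data
        # Files deleted (or not yet created) at `as_of` are omitted.
        if current is not None:
            restored[path] = current
    return restored
```

Restoring with `as_of` set a minute before the attack drops every encrypted version and every attacker-created file in one pass, which is why the operation resembles a large, one-directional sync rather than a file-by-file repair.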
Until now, UniFS has been able to restore millions of files in minutes. LEO A DALY was hit by ransomware, and Held said: "Nasuni was a true lifesaver when we got hit by a ransomware attack. Once we contained the attack, we were able to restore files quickly. Our operations hardly missed a beat."
With GFA it is faster still, think seconds, and a demo shows 1,001,233 files being restored in 38.8 seconds.
That actually meant the file metadata was restored, as a look at the Size and Size on disk numbers in the image above shows. The demo then showed restored files being accessed once the recovered file:folder metadata was back in place. It was all smooth and simple.
Nasuni says a survey of its customers who had been hit by ransomware attacks showed none of them paid a ransom. More than one third of them stopped the attack, identified infected files and restored valid versions of them in under an hour. The others presumably took longer. GFA will help more of them break the 60-minute mark when handling a ransomware attack in the future.
WhatsApp has introduced end-to-end encryption for chat backups on iCloud. This means that even if you back up your messages and media to Apple's cloud servers, they will be protected by end-to-end encryption, so users can rest assured their private conversations are safe no matter where they go or who gets hold of them.
Until now, WhatsApp messages stored in iCloud have not been protected by end-to-end encryption. WhatsApp is introducing a feature that gives users the option of password-protecting their chats before uploading them to Apple's cloud platform.
This way, you can ensure that no one but you has access, and thus avoid your information being compromised if the backup gets hacked or otherwise accessed by an outsider.
Encrypted chat backups are set to roll out in the coming weeks. The encryption key will secure all of your backups on remote iCloud servers by ensuring they cannot be read without the password.
The 64-digit encryption key or password will be optional and can be saved in the user's account so they can recover their data if needed.
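The core idea behind password-protected backups is that the password never leaves the device; it is stretched into a symmetric key, and only ciphertext reaches the cloud. The sketch below illustrates that principle with Python's standard library. It is not WhatsApp's actual protocol (WhatsApp uses hardware security modules and a vetted AEAD cipher), and the toy HMAC-based keystream here is for illustration only, never for production use:

```python
import hashlib
import hmac


def derive_key(password: str, salt: bytes) -> bytes:
    """Stretch a password into a 32-byte symmetric key with PBKDF2.

    Without the password (and salt), the key cannot be recomputed,
    so ciphertext stored in the cloud stays unreadable."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)


def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: HMAC(key, counter) as the keystream.

    Illustration of the concept only -- real systems use vetted
    AEAD ciphers such as AES-GCM, not hand-rolled constructions."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        block_index = offset // 32
        ks = hmac.new(key, block_index.to_bytes(8, "big"), hashlib.sha256).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)
```

Encrypting a backup before upload and decrypting it after download are the same XOR pass with the same derived key; a wrong password derives a different key, so decryption yields only garbage, which is exactly the guarantee the article describes. (In practice the salt would be random, e.g. from `os.urandom(16)`, and stored alongside the ciphertext.)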
The encrypted chat backups feature is coming to Android (for WhatsApp users backing up their chats) and iOS in the next few weeks.
See the original post here:
WhatsApp new feature will allow you to securely back up your chats in iCloud - Thewistle