
Best ZenMate Alternative in 2022 [5 VPNs Similar to ZenMate] – Cloudwards

ZenMate is a VPN service that offers decent speeds and various security protocols. However, the feature set isn't impressive, especially when compared with major VPN services like ExpressVPN. In this guide, we help you select the best ZenMate alternative.

The downfalls of ZenMate VPN aren't just the lack of features, though. The website has some conflicting information: all pages, including the checkout page, tout a 30-day money-back guarantee, yet the refund policy states that subscribers only have a 14-day window to claim a refund.

If you're looking for a better alternative, we have you covered. Below, we provide an overview of five of the best VPN services as potential ZenMate VPN alternatives.

ZenMate VPN has a free browser extension that allows you to use a VPN connection for your browser traffic only. It also comes with several restrictions, such as access to a maximum of four server locations for IP addresses and a top speed of 2 Mbps; lifting those restrictions requires a paid plan.

ZenMate is an average VPN, and better options exist. For only a few extra cents per month, you can get a ton of extra features and much better performance.

You can get the ZenMate VPN for free by using its free browser extension. The extension comes with limited functionality and offers a maximum speed of 2 Mbps and access to four server locations.

We did some legwork to find the best ZenMate VPN alternatives, and here are the five services we found:

ZenMate offers OpenVPN and IKEv2 combined with AES-256 encryption. That's an excellent combination for online security, but other VPNs also offer newer protocols like WireGuard; ZenMate offers nothing beyond OpenVPN and IKEv2.

With ZenMate, you also don't get split tunneling, an increasingly common feature. While the VPN will help you access major streaming services like Netflix and Amazon Prime Video, you might need to switch between servers to find one that works.

ZenMate's pricing isn't all that great given its feature set. For only a few cents more, you can get much better performance from other low-cost VPNs like Surfshark or CyberGhost. We talk in greater detail about ZenMate in our ZenMate review.

We looked at ZenMates shortcomings and picked the best alternatives. We give you an overview of the top picks below.

ExpressVPN is the best VPN for streaming.

ExpressVPN is hands down the best VPN service currently on the market. It performs well across the board in terms of streaming, connection speeds, torrenting and security.

You get several security protocols to choose from, including OpenVPN, IKEv2, IPsec, PPTP, L2TP and Lightway, ExpressVPN's proprietary protocol that optimizes performance. With ExpressVPN, you also get split tunneling, a feature that's not available on ZenMate, as well as all the essentials like a kill switch and DNS leak protection.

If you want to view geoblocked content on your gaming console or a Chrome OS device, ExpressVPN has an app for those devices and many more, while ZenMate doesn't. You can still view geoblocked content on those devices as a ZenMate user, but you'll need to take the tedious route of first installing the VPN on a router.

ExpressVPN is the best VPN for streaming. It gets into most streaming platforms, including Netflix, Amazon Prime Video, BBC iPlayer and Hulu. In addition to easily getting past geoblocks, ExpressVPN also offers excellent download speeds, so you won't experience any buffering, even when you're streaming in 4K.

You also get a feature called MediaStreamer that spoofs your location without additional encryption. Your connection is less secure and you won't be able to choose an IP address in a specific country, but you'll enjoy faster connection speeds and better streaming performance. Learn more about ExpressVPN in our ExpressVPN review.

You can try ExpressVPN risk-free using the 30-day money-back guarantee.

ExpressVPN is the best among major VPN providers and is priced as such; the one-year plan is its best value. If you don't mind spending a little extra to get the best VPN service, ExpressVPN is the perfect choice. If you're not entirely confident, try it out risk-free using the 30-day money-back guarantee, and if you don't like it, you can claim a full refund.

NordVPN is the fastest VPN service out there.

NordVPN is a very popular VPN service and one of the best alternatives to ZenMate. With over 5,500 VPN servers across 59 countries, NordVPN lets you acquire IP addresses in all popular locations, so you can access other countries' content libraries on streaming platforms.

Speaking of streaming, NordVPN gets into all major platforms. More importantly, NordVPN is the fastest VPN as per our speed tests. Fast speeds translate to a pleasant, buffer-free streaming experience.

NordVPN fills in all the gaps in ZenMate's feature set. For example, you get split tunneling with NordVPN, and it offers three security protocols: OpenVPN, IKEv2 and NordLynx (NordVPN's proprietary protocol inspired by WireGuard). You also get to choose between two ciphers: AES-256 and ChaCha20 (when you use NordLynx).

The NordVPN app comes with a kill switch and DNS leak protection, and its overall security is excellent. More importantly, NordVPN provides several types of VPN servers to increase your online security.

NordVPN offers four types of specialty servers: double VPN, P2P, Onion over VPN and obfuscated servers. Connecting to a specialty VPN server translates to an even more secure network.

Double VPN is the most commonly used type of the four, in which your internet traffic goes through two secure servers, adding an extra layer of security. Learn more about NordVPN in our comprehensive NordVPN review.

NordVPN offers great value for the money given the feature set and performance.

Considering how similar NordVPN's feature set and performance are to ExpressVPN's, the two-year plan is its best value. You can still try it out before committing for the long term, using its 30-day money-back guarantee to see if it fits your needs.

Surfshark offers unlimited simultaneous connections like ZenMate.

Surfshark is a feature-rich VPN that also allows unlimited simultaneous device connections, just like ZenMate. As with most major VPN services, Surfshark has all the essentials, including a kill switch, split tunneling, unlimited devices and a large network of over 3,200 servers across 95 countries.

Even though Surfshark arrived on the market after most VPNs on this list, it has managed to build a strong feature set. You get an ad blocker called CleanWeb that blocks ads in the browser and other apps, and a feature called NoBorders mode that enables users to bypass restrictions in censorship-heavy countries like China.

With Surfshark, you can also access geoblocked platforms more easily than you can with ZenMate. You wont need to switch between servers to find one that works for a particular streaming platform.

Unlike ZenMate, Surfshark offers several security features such as double VPN (called Multihop on Surfshark) and IP address rotation. You can use double VPN by selecting one of the 12 predefined pairs of server locations to make your connection more secure.

Surfshark offers a range of additional features like CleanWeb and NoBorders mode.

The IP address rotation feature automatically changes your IP address every few minutes. If you need even more features, you can get Surfshark's security package, Surfshark One, as an add-on. The app also has Camouflage Mode, which ensures that not even your ISP can tell you're using a VPN.

Surfshark is more affordable than ExpressVPN and NordVPN, especially when you consider you have unlimited devices. You can also try it out with its 30-day money-back guarantee. Find out more about Surfshark in our comprehensive Surfshark review.


CyberGhost has one of the largest server networks, with nearly 9,000 servers.

CyberGhost offers a massive server network with over 8,900 servers spread across 91 countries. You don't get as many security protocols to choose from as with other services, but you still get one more than with ZenMate: CyberGhost offers OpenVPN, IKEv2 and WireGuard, all with AES-256 encryption.

CyberGhost's apps for desktop and mobile devices have a quick-connect button on the main screen and a list of servers. The app shows you the distance between your physical location and the server location, as well as the server load.

This information is especially helpful in CyberGhost's case, given that speed isn't the service's strongest suit. The interface also makes selecting a server easier by letting you sort servers by distance or load.

Few of the major services listed here offer anything in the way of automation, but CyberGhost lets you automate VPN-related tasks using smart rules. For example, you can set a rule to connect to a specific server location automatically whenever you launch the VPN app. Learn more about CyberGhost's features in our CyberGhost review.

CyberGhost is the most pocket-friendly ZenMate alternative on the list.

CyberGhost is the lowest-priced service on the list, and its three-year plan is its best value. It also offers a generous 45-day money-back guarantee on all annual plans, so you can claim a full refund if you're not fully satisfied.

Windscribe is the best free ZenMate alternative.

Windscribe is a free ZenMate alternative. In most cases, we don't recommend using free VPNs; they're generally slow, unreliable and even insecure, unless they're on our best free VPNs list, and Windscribe is at the top of that list.

Even on the free version, you get top-notch connection speeds and no bandwidth cap. Like ZenMate, Windscribe also offers unlimited simultaneous connections. The free plan gives you access to servers in 10 countries and 10GB of monthly data once you confirm your email.

Windscribe also offers a powerful blocking tool called R.O.B.E.R.T. However, the free version only allows you to set three custom access rules. If you want to set more than three custom rules to block domains or IPs, you'll need the paid version. Learn more about R.O.B.E.R.T. and other Windscribe features in our thorough Windscribe review.

Not many free VPNs can help you access Netflix U.S., but Windscribe can, and it can unblock other major platforms too. Since Windscribe offers fast connection speeds, you won't need to worry about buffering, even when you're on the free plan.

Windscribe offers flexible pricing where you can choose to add server locations.

Windscribe also has a premium version with two pricing models: build-a-plan and a time-based plan. The build-a-plan option offers more flexibility: you choose your server locations for a dollar each and can add unlimited data for another dollar, with a minimum checkout value of $3 per month.
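The build-a-plan arithmetic above is simple enough to sketch. The function below is a hypothetical helper using the article's figures ($1 per location, $1 for unlimited data, $3 monthly minimum); it is not Windscribe's actual checkout logic.

```python
# Illustrative calculator for the build-a-plan pricing described above.
# Prices are the article's figures, not an official rate card.

def build_a_plan_price(num_locations: int, unlimited_data: bool) -> int:
    """Return the monthly price in dollars, applying the $3 minimum."""
    price = num_locations * 1 + (1 if unlimited_data else 0)
    return max(price, 3)

print(build_a_plan_price(2, True))   # 2 locations + unlimited data -> 3
print(build_a_plan_price(5, False))  # 5 locations, capped data -> 5
```

Note how the $3 minimum means a small plan (one location plus unlimited data, nominally $2) is still billed at $3.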


ZenMate has a no-logging policy, and for the most part, it should ensure your privacy online. However, it's based in Germany, which is a Fourteen Eyes alliance member. If you're looking for airtight privacy, you should consider a different service like ExpressVPN or NordVPN.

ZenMate is an average VPN at best. If you're looking for top-notch security and superior performance, there are better options.

The best VPN services, such as ExpressVPN, offer various security protocols, easily unblock streaming platforms and come with a bunch of extras like browser extensions. You also don't have to shell out a lot of extra money when you pick the right ZenMate alternative, whether that's NordVPN or another VPN on this list.

Have you switched to any of the ZenMate VPN alternatives? What was your experience like? Let us know in the comments below, and as always, thank you for reading.



Beyond Shadow IT: Expert Advice on How to Secure the Next Great Threat Surface – Security Intelligence

You've heard all about shadow IT, but there's another shadow lurking on your systems: Internet of Things (IoT) devices.

These smart devices are the IoT in shadow IoT, and they could be maliciously or unintentionally exposing information. Threat actors can use that to access your systems and sensitive data, and wreak havoc upon your company.

A refresher on shadow IT: it covers all of the applications and devices your employees use without your knowledge or permission to get their jobs done and handle their work data. Examples include departments purchasing and installing their own software, users making unauthorized changes to their endpoints and employees using cloud services that aren't company standard.

Add a few IoT devices into the mix, and your security efforts are suddenly, and obviously, more vulnerable. What's not as obvious is that the shadow IoT phenomenon can include things like multicolored light bulbs, coffee makers and Bluetooth speakers.

These devices pose new security risks for the enterprise, as IoT is typically not as secure as it should be. In 2021, 12.2 billion devices were connected to the internet worldwide, with growth to an expected 14.4 billion active connections in 2022. If you think none of those devices are shadow devices on your network, think again: according to Infoblox, 35% of U.S., UK and German companies have more than 5,000 shadow devices connected to their network on any given day.

TikTok personality and security engineer Jose Padilla (@secengineer) knows how to see which devices might be at risk. His frequent TikTok posts test different IoT devices to determine just how risky they are and examine what kind of network traffic the devices are outputting.

"The Mirai botnet was created almost entirely by IoT devices," he said. "That's what inspired me to start looking more into what these IoT devices are doing on my network. Of course, I want to use smart things. They're very convenient. I obviously love technology. But as a security engineer, I always have to second-guess these kinds of things."

Padilla has tested almost two dozen devices and explains that he takes each through a rigorous process that requires at least three or four hours of sifting through logs to establish patterns to see if anything stands out.

What surprised Padilla most from his testing is the security issues arising from something as simple as a smart light bulb. You can watch his video for more detail, but we won't name the product here.

"It's such a well-known brand; a premium IoT brand," he said. "I expected it to go completely smooth and be boring, and it definitely wasn't boring."

Padilla explained that the traffic generated from the smart bulb would raise serious red flags for any security team.

Here are the highlights of what he discovered:

What concerned Padilla the most was a vulnerability that, if exploited, could unleash significant damage.

"One of those things that I found was the authentication sessions," he said. "The authentication sessions are the connection between the company's cloud servers and the bulb's smart hub itself. So if you wanted control from the cloud, this is the connection that's going to do it."

Plus, he had this feature turned off in his tests, but the hub was still connecting to the cloud. All relevant tokens (the single sign-on token, the session token and the authentication token) were transmitting data in the clear.

According to Padilla, a similar bug or vulnerability was found in another of the manufacturer's products, a smart air filter, a few years ago, but was quickly fixed.

"There's no excuse for IoT devices to send traffic over the internet unencrypted," he said. "It's just opening up more risk. It's another threat vector, whether it will be easy to exploit or not."

While most of the attacks that could potentially be launched against the light bulb are benign, there are proofs of concept that should raise eyebrows.

"We've seen that some light bulbs can have a faster flicker rate, and one potential attack could produce a strobe light effect," he said. "It could be harmful to anyone that's photosensitive. But those are more minor in comparison to some of the other attacks or vulnerabilities and proofs of concept I've seen for this light bulb."

Padilla explains that security testers were able to upload malicious firmware to the light bulb, and it was not difficult for them to control the light bulb and force an unsuspecting user to connect to a bad bulb. The attack chain would go from the bulb to their phone to the hub.

"The proof of concept demonstrates the kill chain that can happen from just controlling one device," he said. "It's not just turning a light on or off; it can go from there to either running code on your phone or the hub, and it can get your network to trust those two devices. From there, the sky's the limit."

Preventing threats resulting from shadow IoT is never easy. After all, shadow IT and shadow IoT are so named because IT teams are in the dark. But, like everything in cybersecurity, good cyber hygiene goes a long way. If your organization is already deploying network segmentation, vulnerability scans, pen tests and patch management, you're many steps ahead.

"The thing I can advise for organizations wanting to use smart devices is the same thing I suggest for home users: put it on an isolated network and don't allow it to talk to your main network," he said. "Treat it as a completely untrusted zone. If the shadow IoT devices are on an isolated network, there should be a safe disconnect."

It should also come as no surprise that the most basic of security basics should be prioritized.

"The same importance should be applied to patch management," he said, adding that scanning devices for vulnerabilities via vulnerability assessments and pen tests is also critical.

Finally, for the best protection against shadow IoT, Padilla suggests companies apply principles that align with zero trust.

Whether it's shadow IT, shadow IoT or other common threats, users should only have access to the resources they need and only to the devices they should have access to. And shouldn't that be table stakes for security policy today anyway?

Mark Stone is a Hubspot-certified content marketing writer specializing in technology, business, and entertainment. He is a regular contributor to Forbes Bra...



Fiber Technology: The Ajax Wired Revolution – Nation World News

Ajax innovations have reached a new level. Unlike the 1980s-era technology that still persists in the security industry, Ajax engineers have developed Fibra, a wired technology of the digital generation. Fibra combines the reliability of a cable with the freedom of a radio connection: a simple four-core cable running Fibra exhibits highly unusual characteristics.

Up to 2,000 meters of connectivity, compatibility with all types of devices, 60 hours of operation on a backup battery for a system of 30 devices, advanced tamper protection, photo verification, instant notifications, hassle-free installation and remote configuration via an app: everything that was possible with wireless Ajax is now available over a wire with Fibra.

Fibra technology at a glance

At the physical level, Fibra looks like a bus connection: detectors are connected to a control panel with a four-core cable. The real revolution, however, hides at the software level. The Fibra communication protocol inherits all the advantages of Jeweller radio technology, since it requires minimal power for two-way transmission of data packets. This gives more freedom when setting up the system compared to the analog and digital solutions available on the market.

A single Fibra line can connect one or dozens of different detectors, sirens and keypads in any combination. The same goes for MotionCam Fibra detectors: no separate cable is required for photo verification, as photos travel over the same line that system devices use to communicate events. Devices are organized into rooms and groups in the Ajax app regardless of their physical connection to the hub, which means the security system has both a physical and a digital architecture.

Multiple Fibra lines allow the cable to be routed along a convenient and safe path, with the system's digital architecture then adjusted in the PRO app. With this flexibility and a line length margin of 2 km, it doesn't matter whether you're protecting a multi-storey building or several buildings: there is no need to complicate your system with additional signal amplifiers or power supplies.

The acceptable Fibra line length compared with the average length of cables available on the market.

The logic of the Fibra communication protocol protects against congestion: even 30 detectors connected to one line will communicate stably when triggered simultaneously, and alarms are broadcast in less than 0.15 seconds.

Thanks to Fibra, detectors consume tens of microamps, 100 times less than the market average for wired digital detectors. This low power consumption allows more devices to be connected to one line and enables longer cable runs. It is the logic of the protocol that lets Fibra inherit the key characteristics of Jeweller.

Fibra communication follows the TDMA (time-division multiple access) principle: each device is assigned a small time slot to exchange data with the hub, and the rest of the time its communication module is inactive. This significantly reduces power consumption and helps prevent interference even when multiple devices communicate simultaneously.
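The TDMA scheme described above can be sketched in a few lines: each device owns a fixed slot in a repeating frame, so transmissions never collide and a device's radio or line driver can sleep outside its slot. The slot counts, device names and helper functions below are invented for illustration and are not Ajax's actual protocol parameters.

```python
# Minimal TDMA sketch: assign each device a fixed slot in a repeating frame.

def tdma_schedule(device_ids, slots_per_frame):
    """Assign each device its own slot index within the frame."""
    if len(device_ids) > slots_per_frame:
        raise ValueError("more devices than available slots")
    return {dev: slot for slot, dev in enumerate(device_ids)}

def owner_of(t, schedule, slots_per_frame):
    """Which device may transmit at discrete time t? None means an idle slot."""
    slot = t % slots_per_frame
    inverse = {s: d for d, s in schedule.items()}
    return inverse.get(slot)

devices = [f"detector-{i}" for i in range(3)]
sched = tdma_schedule(devices, slots_per_frame=8)
print(owner_of(0, sched, 8))  # detector-0
print(owner_of(9, sched, 8))  # detector-1 (slot 1 of the next frame)
```

Because ownership of each instant is unambiguous, no two devices ever transmit at once, which is what allows both the low power draw and the interference-free behavior described in the article.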

These time slots are arranged so that alarms are broadcast immediately. If a device does not respond, the hub repeats the request twice and then notifies users and the security company about the loss of the device.
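That request-and-repeat behavior amounts to simple retry logic, sketched below with invented function names and a simulated device; Ajax's real polling code is of course not public.

```python
# Sketch of poll-with-retries: one initial request plus two repeats,
# after which the device is considered lost and users are notified.

def poll_device(respond_fn, retries: int = 2) -> bool:
    """Return True if the device answered within the allowed attempts."""
    for _ in range(1 + retries):  # initial request plus `retries` repeats
        if respond_fn():
            return True
    return False  # caller would now notify users and the security company

# Simulate a device that only answers on the third attempt.
attempts = iter([False, False, True])
print(poll_device(lambda: next(attempts)))  # True: answered on a retry

print(poll_device(lambda: False))  # False: device considered lost
```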

The result is exceptional system autonomy in case of a blackout: 30 wired Fibra devices can run for up to 60 hours on a 7 Ah backup battery. That is five times longer than Grade 2 requires, and it meets the requirements of Grade 3.
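A quick back-of-envelope check of that autonomy figure: a 7 Ah battery lasting 60 hours implies an average system draw of roughly 117 mA. The per-detector current below is an assumed 50 µA, consistent with the article's "tens of microamps" claim but not a datasheet value.

```python
# Back-of-envelope check of the 60-hour autonomy claim.

battery_mah = 7000   # 7 Ah backup battery
runtime_h = 60       # claimed autonomy
avg_draw_ma = battery_mah / runtime_h
print(f"average system draw: {avg_draw_ma:.0f} mA")  # ~117 mA

detectors = 30
per_detector_ma = 0.05  # 50 µA per detector, an assumed figure
detector_total_ma = detectors * per_detector_ma
print(f"30 detectors draw only ~{detector_total_ma:.1f} mA in total")
```

On these numbers the detectors account for barely 1 percent of the budget; the hub and its communication electronics consume the rest, which is why low-power detectors alone make such long cable runs and battery runtimes feasible.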

What are security grades, and why is it so important for a security system to have them?

To protect transmitted data, Fibra uses dynamic key encryption. Hacking it to obtain the data would require enormous computing power and time; in other words, it would be a futile effort. Encrypted data must still be accessed securely, so each communication session between the hub and a device begins with authentication, in which tokens and unique properties are compared. If even one parameter fails validation, the hub ignores the device's commands. There is therefore no point in manipulating or intercepting the data.

An Ajax security system constantly monitors its own status: the hub polls connected devices as often as the system configuration allows. If a detector does not respond, a notification is sent within 36 seconds. The voltage of each device is also monitored, so if the line is damaged, a notification is sent as well. The system immediately detects a short circuit on the line and notifies the monitoring station and users, and once the problem is resolved, operation resumes automatically with no fuses to replace.

In addition, the next update of OS Malevich will allow a Fibra line to be physically protected against breaks. To do this, the beginning of the line is connected to one hub terminal and its end to another, forming a ring topology. If the line fails, the ring splits into two working segments, and the security company and users are notified of the incident.

Fibra supports a ring connection topology; systems built on the Hub Hybrid can have up to four rings.

Fibra fully inherits the Ajax wireless user experience. App notifications contain comprehensive information: event type, date, time, detector name and room. Thanks to device addressability, you know exactly what is happening in the system during both operation and installation, even if a dozen detectors are connected to one line. Add the photo verification of the MotionCam motion detector, and you have most of the system's information and control at your fingertips.

Special software tools have been developed to simplify the installation of Fibra equipment. The Line Scan function automatically finds devices not yet assigned to the hub and allows them to be added to the system with just a few taps.

The Fibra line scanning interface in the Ajax PRO app.

A built-in maximum-load test of power consumption helps ensure the system has enough power in any scenario. After setting up the system, the installer uses the PRO app to simulate the maximum possible energy consumption: detectors raise alarms, keypads activate and sirens turn on. If the system passes the test, it will have enough power in any situation.

If you use the PRO desktop application for monitoring Ajax systems, the list of convenient features is rounded out by automatic system synchronization: after installation and configuration, all system settings are downloaded automatically from the cloud servers. The monitoring station engineer wastes no time entering equipment manually, which eliminates mechanical errors.

Fibra is synonymous with flexibility, reliability and control. As part of the Ajax ecosystem, the new wired technology delivers a consistent user experience regardless of how devices connect to the hub. Efficient hardware and sophisticated software lay the foundation for another Ajax product line, one that sets a new standard in the security industry.


Cloud and datacenters start to feel the slowdown amid spiking energy costs – The Register

The datacenter industry may be starting to feel the effects of the economic slowdown, leading to further impacts on IT vendors and other suppliers, according to reports, while operators in the UK in particular are feeling the pain from rising energy costs.

Cloud and hyperscale companies may have seemed less vulnerable to swings in the wider economy, thanks to the growing adoption of cloud services over the past decade. This was especially so during the pandemic, when many businesses were forced to upscale their use of cloud services to ensure staff could continue to work remotely.

But the signs are starting to point to a possible slowdown, with Reuters reporting that Google Cloud, Microsoft's Azure, and Amazon's AWS all showed slower growth in their recent results.

For example, Google Cloud reported $6.3 billion of revenue for Q2 2022, a 35 percent year-on-year increase, but a slower increase than the 44 percent jump it reported for its Q1 results.

In June, analyst outfit TrendForce was predicting that the global server market will grow more slowly in 2022 than in the past, with China's cloud companies Baidu, Alibaba, and Tencent all lowering their procurement this year. It warned that this could also spread to cloud and hyperscale companies in the US, leading to overall server shipments falling.

All the big cloud players have recently extended the life of their servers in order to save on procurement costs. Microsoft announced this month it was extending the life of its machines by two years, which it expected to save it $3.7 billion next year. Google announced in February an extension of its server lifecycle from three years to four, while Amazon said it expected to save a billion dollars in this quarter by running its servers for six years instead of five.
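The savings come from depreciation accounting: stretching a server's useful life spreads its purchase price over more years, lowering the annual expense. A minimal sketch of straight-line depreciation, with an invented $10,000 per-server cost:

```python
# Straight-line depreciation: annual expense = purchase cost / useful life.
# The $10,000 server cost is an illustrative figure, not a vendor's number.

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    return cost / useful_life_years

cost = 10_000.0
print(annual_depreciation(cost, 4))  # 4-year life -> 2500.0 per year
print(annual_depreciation(cost, 6))  # 6-year life -> ~1666.67 per year
```

Multiply that per-server difference across millions of servers and the billions of dollars the hyperscalers cite become plausible, even though no cash actually changes hands when the accounting life is extended.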

This could be bad news for the IT industry, which had already seen demand falling away on the consumer device side and may now face weakening demand for servers and lucrative components such as memory chips that servers have so far kept buoyant.

However, as The Register reported this month, enterprises are spending more on cloud infrastructure services than ever before, and falling prices for components may leave the hyperscalers well positioned to take advantage of this and add more capacity at a lower cost.

But supply chain issues may still be a complicating factor, with Microsoft's Azure cloud reported to be having difficulty providing enough capacity to meet customer demand last month.

Meanwhile, another factor affecting datacenter operators is the spiraling cost of energy. According to a recent report, datacenter operators in the UK and Ireland have seen their energy bills increase by as much as 50 percent.

In the UK, 57 percent of operators indicated that they are currently spending between 10 and 30 percent of their entire operating costs on electricity, with some paying more and many expecting this figure to hit 40 percent or higher.

The report, from power generation supplier Aggreko, says that this is causing problems because the all-in pricing models adopted by co-location providers mean they are forced to absorb additional costs and price increases.

It was based on a survey of 253 datacenter professionals, with 58 percent of those in the UK reporting that energy bills have had a significant impact on their company's margins.

The report concludes that operators are struggling to remain competitive, particularly as confidence in government support is "tepid" in both the UK and Ireland, and suggests some remedies.

However, it would seem there is still a danger that customers may be asked to pay more to offset some of the rising energy costs. For example, customers of cloud and network provider M247 were hit with a 161 percent hike in charges late last year, with rising energy prices blamed for the increase even then.


Highly-Efficient New Neuromorphic Chip for AI on the Edge – SciTechDaily

A team of international researchers designed, manufactured, and tested the NeuRRAM chip. Credit: David Baillot/University of California San Diego

The NeuRRAM chip is the first compute-in-memory chip to demonstrate a wide range of AI applications while using just a small fraction of the energy consumed by other platforms, all while maintaining equivalent accuracy.

NeuRRAM, a new chip that runs computations directly in memory and can support a wide variety of AI applications, has been designed and built by an international team of researchers. What sets it apart is that it does all this at a fraction of the energy consumed by general-purpose AI computing platforms.

The NeuRRAM neuromorphic chip brings AI a step closer to running on a broad range of edge devices, disconnected from the cloud. This means they can perform sophisticated cognitive tasks anywhere and anytime without relying on a network connection to a centralized server. Applications for this device abound in every corner of the globe and every facet of our lives. They range from smartwatches to VR headsets, smart earbuds, smart sensors in factories, and rovers for space exploration.

Not only is the NeuRRAM chip twice as energy efficient as state-of-the-art compute-in-memory chips (an innovative class of hybrid chips that run computations in memory), it also delivers results that are just as accurate as conventional digital chips. Conventional AI platforms are much bulkier and are typically constrained to large data servers operating in the cloud.

A close-up of the NeuRRAM chip. Credit: David Baillot/University of California San Diego

Additionally, the NeuRRAM chip is highly versatile and supports many different neural network models and architectures. As a result, the chip can be used for many different applications, including image recognition and reconstruction as well as voice recognition.

"The conventional wisdom is that the higher efficiency of compute-in-memory is at the cost of versatility, but our NeuRRAM chip obtains efficiency while not sacrificing versatility," said Weier Wan, the paper's first corresponding author and a recent Ph.D. graduate of Stanford University who worked on the chip while at UC San Diego, where he was co-advised by Gert Cauwenberghs in the Department of Bioengineering.

The research team, co-led by bioengineers at the University of California San Diego (UCSD), presented their results in the August 17 issue of Nature.

The NeuRRAM chip uses an innovative architecture that has been co-optimized across the stack. Credit: David Baillot/University of California San Diego

Currently, AI computing is both power-hungry and computationally expensive. Most AI applications on edge devices involve moving data from the devices to the cloud, where the AI processes and analyzes it. Then the results are transferred back to the device. This is necessary because most edge devices are battery-powered and as a result only have a limited amount of power that can be dedicated to computing.

By reducing the power consumption needed for AI inference at the edge, this NeuRRAM chip could lead to more robust, smarter, and accessible edge devices and smarter manufacturing. It could also lead to better data privacy, because the transfer of data from devices to the cloud comes with increased security risks.

On AI chips, moving data from memory to computing units is one major bottleneck.

"It's the equivalent of doing an eight-hour commute for a two-hour work day," Wan said.

To solve this data transfer issue, researchers used what is known as resistive random-access memory. This type of non-volatile memory allows for computation directly within memory rather than in separate computing units. RRAM and other emerging memory technologies used as synapse arrays for neuromorphic computing were pioneered in the lab of Philip Wong, Wan's advisor at Stanford and one of the main contributors to this work. Although computation with RRAM chips is not necessarily new, it generally leads to a decrease in the accuracy of the computations performed on the chip and a lack of flexibility in the chip's architecture.

"Compute-in-memory has been common practice in neuromorphic engineering since it was introduced more than 30 years ago," Cauwenberghs said. "What is new with NeuRRAM is that the extreme efficiency now goes together with great flexibility for diverse AI applications with almost no loss in accuracy over standard digital general-purpose compute platforms."

A carefully crafted methodology was key to the work, with multiple levels of co-optimization across the abstraction layers of hardware and software, from the design of the chip to its configuration for running various AI tasks. Additionally, the team made sure to account for various constraints that span from memory device physics to circuits and network architecture.

"This chip now provides us with a platform to address these problems across the stack from devices and circuits to algorithms," said Siddharth Joshi, an assistant professor of computer science and engineering at the University of Notre Dame, who started working on the project as a Ph.D. student and postdoctoral researcher in Cauwenberghs' lab at UCSD.

Researchers measured the chip's energy efficiency by a measure known as energy-delay product, or EDP. EDP combines both the amount of energy consumed for every operation and the amount of time it takes to complete the operation. By this measure, the NeuRRAM chip achieves 1.6 to 2.3 times lower EDP (lower is better) and 7 to 13 times higher computational density than state-of-the-art chips.
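The energy-delay product is straightforward to compute. Here is a minimal sketch, using made-up numbers rather than measurements from the paper, of how the metric rewards a chip for being both frugal and fast:

```python
def energy_delay_product(energy_joules, delay_seconds):
    """EDP multiplies the energy an operation consumes by the time it
    takes to complete; lower is better."""
    return energy_joules * delay_seconds

# Hypothetical numbers for illustration only -- not from the paper.
baseline_edp = energy_delay_product(2.0e-9, 1.0e-6)   # a conventional chip
neurram_like = energy_delay_product(1.0e-9, 1.0e-6)   # half the energy, same delay

# The baseline pays twice the energy-delay cost of the frugal chip here.
ratio = baseline_edp / neurram_like
print(ratio)  # 2.0
```

A chip can also improve its EDP by finishing operations faster at the same energy, which is why the metric captures both sides of efficiency.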

Engineers ran various AI tasks on the chip. It achieved 99% accuracy on a handwritten digit recognition task; 85.7% on an image classification task; and 84.7% on a Google speech command recognition task. In addition, the chip also achieved a 70% reduction in image-reconstruction error on an image-recovery task. These results are comparable to existing digital chips that perform computation under the same bit-precision, but with drastic savings in energy.

One key contribution of the paper, the researchers point out, is that all the results featured are obtained directly on the hardware. In many previous works of compute-in-memory chips, AI benchmark results were often obtained partially by software simulation.

Next steps include improving architectures and circuits and scaling the design to more advanced technology nodes. Engineers also plan to tackle other applications, such as spiking neural networks.

"We can do better at the device level, improve circuit design to implement additional features, and address diverse applications with our dynamic NeuRRAM platform," said Rajkumar Kubendran, an assistant professor at the University of Pittsburgh, who started work on the project while a Ph.D. student in Cauwenberghs' research group at UCSD.

In addition, Wan is a founding member of a startup that works on productizing the compute-in-memory technology. "As a researcher and an engineer, my ambition is to bring research innovations from labs into practical use," Wan said.

The key to NeuRRAM's energy efficiency is an innovative method to sense output in memory. Conventional approaches use voltage as input and measure current as the result, but this leads to the need for more complex and more power-hungry circuits. In NeuRRAM, the team engineered a neuron circuit that senses voltage and performs analog-to-digital conversion in an energy-efficient manner. This voltage-mode sensing can activate all the rows and all the columns of an RRAM array in a single computing cycle, allowing higher parallelism.
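The analog dot product at the heart of compute-in-memory can be illustrated with a simplified digital model. This is a sketch of the physics intuition (Ohm's law plus current summing), not the actual NeuRRAM circuit:

```python
def in_memory_matvec(conductances, input_voltages):
    """Simplified model of compute-in-memory: each RRAM cell stores a
    weight as a conductance G; applying voltage V across it draws current
    I = G * V (Ohm's law), and currents merging on a shared line sum
    automatically (Kirchhoff's current law), so each output accumulates a
    full dot product in one array cycle."""
    outputs = []
    for row in conductances:  # rows can all be active in parallel
        outputs.append(sum(g * v for g, v in zip(row, input_voltages)))
    return outputs

weights = [[0.2, 0.5], [0.1, 0.4]]   # illustrative conductance values
x = [1.0, 2.0]                        # input voltages
print(in_memory_matvec(weights, x))   # [1.2, 0.9]
```

In a digital chip every multiply and add would require fetching the weights from memory first; in the analog array, the "fetch" and the arithmetic are the same physical event.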

In the NeuRRAM architecture, CMOS neuron circuits are physically interleaved with RRAM weights. This differs from conventional designs, where CMOS circuits are typically on the periphery of the RRAM weights. The neuron's connections with the RRAM array can be configured to serve as either input or output of the neuron. This allows neural network inference in various data flow directions without incurring overheads in area or power consumption, which in turn makes the architecture easier to reconfigure.

To make sure that the accuracy of the AI computations can be preserved across various neural network architectures, engineers developed a set of hardware-algorithm co-optimization techniques. The techniques were verified on various neural networks, including convolutional neural networks, long short-term memory, and restricted Boltzmann machines.

As a neuromorphic AI chip, NeuRRAM performs parallel distributed processing across 48 neurosynaptic cores. To simultaneously achieve high versatility and high efficiency, NeuRRAM supports data-parallelism by mapping a layer in the neural network model onto multiple cores for parallel inference on multiple data. Also, NeuRRAM offers model-parallelism by mapping different layers of a model onto different cores and performing inference in a pipelined fashion.

An international research team

The work is the result of an international team of researchers.

The UCSD team designed the CMOS circuits that implement the neural functions interfacing with the RRAM arrays to support the synaptic functions in the chip's architecture, for high efficiency and versatility. Wan, working closely with the entire team, implemented the design; characterized the chip; trained the AI models; and executed the experiments. Wan also developed a software toolchain that maps AI applications onto the chip.

The RRAM synapse array and its operating conditions were extensively characterized and optimized at Stanford University.

The RRAM array was fabricated and integrated onto CMOS at Tsinghua University.

The team at Notre Dame contributed to both the design and architecture of the chip and the subsequent machine learning model design and training.

The research started as part of the National Science Foundation funded Expeditions in Computing project on Visual Cortex on Silicon at Penn State University, with continued funding support from the Office of Naval Research Science of AI program, the Semiconductor Research Corporation and DARPA JUMP program, and Western Digital Corporation.

Reference: "A compute-in-memory chip based on resistive random-access memory" by Weier Wan, Rajkumar Kubendran, Clemens Schaefer, Sukru Burc Eryilmaz, Wenqiang Zhang, Dabin Wu, Stephen Deiss, Priyanka Raina, He Qian, Bin Gao, Siddharth Joshi, Huaqiang Wu, H.-S. Philip Wong and Gert Cauwenberghs, 17 August 2022, Nature. DOI: 10.1038/s41586-022-04992-8

Published open-access in Nature, August 17, 2022.

Weier Wan, Rajkumar Kubendran, Stephen Deiss, Siddharth Joshi, Gert Cauwenberghs, University of California San Diego

Weier Wan, S. Burc Eryilmaz, Priyanka Raina, H-S Philip Wong, Stanford University

Clemens Schaefer, Siddharth Joshi, University of Notre Dame

Rajkumar Kubendran, University of Pittsburgh

Wenqiang Zhang, Dabin Wu, He Qian, Bin Gao, Huaqiang Wu, Tsinghua University

Corresponding authors: Wan, Gao, Joshi, Wu, Wong and Cauwenberghs

Excerpt from:
Highly-Efficient New Neuromorphic Chip for AI on the Edge - SciTechDaily

Google: Here’s how we blocked the largest web DDoS attack ever – ZDNet

By Alfa Photo -- Shutterstock

Google Cloud has revealed it blocked the largest distributed denial-of-service (DDoS) attack on record, which peaked at 46 million requests per second (rps).

The June 1 attack targeted one Google Cloud customer using the Google Cloud Armor DDoS protection service.

Over the course of 69 minutes beginning at 9:45 am PT, the attackers bombarded its customer's HTTP/S Load Balancer with HTTPS requests, starting at 10,000 rps and within minutes scaling up to 100,000 rps before peaking at a whopping 46 million rps.

Google says it is the largest ever attack at Layer 7, referring to the application layer, the top layer in the OSI model of the internet.

The attack on Google's customer was almost twice the size of an HTTPS DDoS attack on a Cloudflare customer in June that peaked at 26 million rps. That attack also relied on a relatively small botnet consisting of 5,067 devices spread over 127 countries.

The attack on Google's customer was also conducted over HTTPS but used "HTTP Pipelining", a technique to scale up rps. Google says the attack came from 5,256 source IP addresses across 132 countries.

"The attack leveraged encrypted requests (HTTPS) which would have taken added computing resources to generate," Google said.

"Although terminating the encryption was necessary to inspect the traffic and effectively mitigate the attack, the use of HTTP Pipelining required Google to complete relatively few TLS handshakes."

Google says the geographic distribution and types of unsecured services used to generate the attack match the Mēris family of botnets. Mēris is an IoT botnet that emerged in 2021 and consisted mostly of compromised MikroTik routers.

Researchers at Qrator who previously analyzed Mēris' use of HTTP Pipelining explained that the technique involves sending trash HTTP requests in batches to a targeted server, forcing it to respond to those request batches. Pipelining scales up rps, but as Google noted, the technique meant the attackers forced relatively few TLS handshakes.
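For illustration, HTTP pipelining amounts to concatenating requests on one connection. The sketch below only builds such a batch as bytes (it sends nothing), to show why a single TLS handshake can carry many requests:

```python
def pipelined_batch(host, path, count):
    """HTTP/1.1 pipelining: many requests written back-to-back on a single
    connection, without waiting for each response. Over HTTPS, one TLS
    handshake therefore covers the whole batch."""
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "\r\n"
    ).encode()
    return request * count  # one write carries `count` requests

batch = pipelined_batch("example.com", "/", 100)
print(batch.count(b"GET / HTTP/1.1"))  # 100
```

Because the handshake is the expensive part of HTTPS for a sender, amortizing it over a large batch is what lets a modest botnet generate an outsized request rate.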

Cloudflare attributed the 26 million rps attack to what it called the Mantis botnet, which it considered an evolution of Mēris. Mantis was powered by hijacked virtual machines and servers hosted by cloud companies rather than low-bandwidth IoT devices, according to Cloudflare.

Google noted that this Mēris-related botnet abused unsecured proxies to obfuscate the true origin of the attacks.

It also noted that around 22% (1,169) of the source IPs corresponded to Tor exit nodes, but the request volume coming from those nodes amounted to just 3% of the attack traffic.

"While we believe Tor participation in the attack was incidental due to the nature of the vulnerable services, even at 3% of the peak (greater than 1.3 million rps) our analysis shows that Tor exit nodes can send a significant amount of unwelcome traffic to web applications and services."
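The reported figures are easy to sanity-check with a little arithmetic on the numbers from the article:

```python
# Values as reported by Google for the June 1 attack.
total_ips = 5256        # source IP addresses
tor_ips = 1169          # of which were Tor exit nodes
peak_rps = 46_000_000   # peak requests per second

# Share of source IPs that were Tor exit nodes: ~22%.
print(round(tor_ips / total_ips * 100))  # 22

# 3% of the 46M rps peak, using integer math to avoid float noise.
print(peak_rps * 3 // 100)  # 1380000 -> "greater than 1.3 million rps"
```

The numbers line up: a fifth of the IPs contributing only 3% of the volume is consistent with Tor exit nodes being incidental, lower-bandwidth participants.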

Go here to read the rest:
Google: Here's how we blocked the largest web DDoS attack ever - ZDNet

Global Infrastructure as Code (IaC) Market Report 2022: Advent of Modern Cloud Architecture & Demand for Better Optimization of Business…

Dublin, Aug. 18, 2022 (GLOBE NEWSWIRE) -- The "Global Infrastructure as Code (IaC) Market by Tool (Configuration Orchestration, Configuration Management), Service, Type (Declarative & Imperative), Infrastructure Type (Mutable & Immutable), Deployment Mode, Vertical and Region - Forecast to 2027" report has been added to ResearchAndMarkets.com's offering.

The infrastructure as code market is projected to grow from USD 0.8 billion in 2022 to USD 2.3 billion by 2027, at a Compound Annual Growth Rate (CAGR) of 24.0% during the forecast period. By optimizing and refactoring infrastructure builds, IaC technologies can free system administrators from laborious manual procedures and allow application developers to concentrate on what they do best.

Programs, configuration data, and automation tools are used in the infrastructure as code (IaC) deployment and management approach. This method can be applied to cloud services as well as to hardware, including web servers, routers, databases, load balancers, and personal computers.

It is distinct from conventional infrastructure management, which depends on manual or interactive device configuration. IaC refers to a high-level approach to building infrastructure rather than a particular method, device, or protocol. Utilizing automated testing and quality-control techniques, infrastructure as code makes use of the software development process. Instead of manually altering the infrastructure, modifications to the configuration are accomplished by altering the program.

Based on Component, tools segment to register the largest market size during the forecast period

Based on Component, the infrastructure as code market is segmented into tools and services. The market size of the tools segment is estimated to be the largest during the forecast period. Infrastructure as code (IaC) is the process of managing and provisioning computer data centers through machine-readable definition files rather than physical hardware configuration or interactive configuration tools. Many tools fulfill infrastructure automation capabilities and use IaC. The framework or tool that performs changes or configures infrastructure declaratively or imperatively based on a programmatic approach can be considered under IaC. Traditionally, server (lifecycle) automation and configuration management tools were used to accomplish IaC. Now, enterprises are also using continuous configuration automation tools or stand-alone IaC frameworks, such as Microsoft's PowerShell DSC or AWS CloudFormation.

The Imperative segment to account for the highest CAGR during the forecast period

Based on type, the infrastructure as code market is segmented into declarative and imperative. The imperative segment is expected to grow at a higher CAGR during the forecast period. The imperative approach helps to prepare automation scripts that provision the client's infrastructure one specific step at a time. While this can be more work to manage as it scales, it can be easier for existing administrative staff to understand and to leverage configuration scripts that already exist. With an imperative approach, a developer writes code specifying the steps the computer must take to accomplish the goal. This is referred to as algorithmic programming. In contrast, a declarative approach involves composing the problem as a description of the desired end state.
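The declarative/imperative split can be sketched in a few lines. This is a hypothetical illustration, not any real IaC tool's API: the imperative style lists steps, while the declarative style states the desired end state and lets a reconciler compute the steps.

```python
# Imperative: spell out each step to take, in order (`cloud` is a
# hypothetical client object, for illustration only).
def provision_imperative(cloud):
    cloud.create_network("net-1")
    cloud.create_server("web-1", network="net-1")
    cloud.attach_storage("web-1", size_gb=100)

# Declarative: describe only the desired end state.
desired = {
    "networks": ["net-1"],
    "servers": {"web-1": {"network": "net-1", "storage_gb": 100}},
}

def reconcile(current, desired):
    """Return the actions needed to move `current` toward `desired`."""
    actions = []
    for net in desired["networks"]:
        if net not in current.get("networks", []):
            actions.append(("create_network", net))
    for name, spec in desired["servers"].items():
        if name not in current.get("servers", {}):
            actions.append(("create_server", name, spec))
    return actions

# Starting from nothing, both resources need creating; against a matching
# state the plan is empty -- declarative runs are idempotent.
print(len(reconcile({}, desired)))  # 2
print(reconcile(desired, desired))  # []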

Asia Pacific to hold highest CAGR during the forecast period

The Asia Pacific infrastructure as code market is expected to grow at the highest CAGR of 27.5% from 2022 to 2027, due to growing industrialization in this region. In this region, the adoption of new and emerging technologies has gained momentum in recent years. Public cloud is gaining huge adoption due to its low costs, on-demand availability, and improved security.

The availability of skilled labor and the keen focus of SMEs and large enterprises to enter and grow in this region are a few factors driving the adoption of the IaC market. Asia Pacific is expected to witness significant growth during the forecast period. The region has always been cautious about investment plans in terms of funding. Major players, such as Microsoft, AWS, Google, and IBM, are expanding their cloud and IaC rapidly in this region due to the increasing number of customers and growing economic outlook. The increasing adoption of emerging technologies, such as big data, IoT, and analytics, is expected to drive the growth of the IaC market in Asia Pacific region.

Market Dynamics

Drivers

Restraints

Opportunities

Challenges

Key Topics Covered:

1 Introduction

2 Research Methodology

3 Executive Summary

4 Premium Insights

5 Market Overview and Industry Trends

6 Infrastructure as Code Market, by Component

7 Infrastructure as Code, by Type

8 Infrastructure as Code, by Infrastructure Type

9 Infrastructure as Code Market, by Organization Size

10 Infrastructure as Code, by Deployment Mode

11 Infrastructure as Code Market, by Vertical

12 Infrastructure as Code Market, by Region

13 Competitive Landscape

14 Company Profiles

15 Adjacent Markets

16 Appendix

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/19hrj0

Read this article:
Global Infrastructure as Code (IaC) Market Report 2022: Advent of Modern Cloud Architecture & Demand for Better Optimization of Business...

Why Hybrid Cloud and Why Now? – RTInsights

Hybrid cloud deployments are the new norm, allowing workloads and data to run and reside on a combination of on-premises systems, private clouds, and public cloud services.

Hybrid cloud lets businesses deploy workloads and data on a mix of on-premises, private cloud, or public cloud infrastructure. That gives businesses great flexibility and allows them to optimize workloads by selectively matching infrastructure to needs. And as such, hybrid cloud is ideally suited to the evolving infrastructure needs of modern business.

How much interest is there in hybrid cloud? Some industry surveys found hybrid cloud adoption is on the rise. Roughly half (48 percent) of the respondents in one survey plan to migrate 50 percent or more of their applications to a cloud this year. That aligns with another survey conducted last year, which found that more than a third (38 percent) of organizations already had a hybrid cloud strategy in place.

What's driving the interest in hybrid? Hybrid cloud goes well with the embrace of cloud-native development and the composable enterprise. New applications are being built as an assembly of components based on microservices and APIs. Similarly, old monolithic applications are being deconstructed, offering up particular services, code, or data for use in new applications. A hybrid cloud strategy allows businesses to run the different elements of such distributed applications where they run best, in the most cost-effective manner.

Digital transformation is another driver for the adoption of hybrid cloud. Digital transformation is about developing innovative offerings quickly and meeting customer availability and app performance expectations. A hybrid cloud strategy makes that all possible so that software developers can iterate as quickly as possible without having to wait for IT to set up servers, storage, and networking on-premises. Yet, it allows a business to keep apps and data that need to be secured on-premises.

Hybrid cloud also gives businesses the ability to place and shift workloads and data to the right infrastructure as requirements evolve. For example, it might make sense to move data on a public cloud back on-premises or to a private cloud if data privacy regulations are strengthened in a particular region. Similarly, an AI or ML model might be proven out on-premises but moved to a highly-scalable public cloud when deployed in production so that the needed compute power is available.

Such deployment flexibility is increasingly essential. Some applications will be best targeted to one type of cloud early in their development life cycle, only to have that shift to another later on. For example, it is quite common for a new consumer application to have wildly varying demand when first released, but over time to have its usage settle. A hybrid approach gives a business the option to use a public cloud to quickly scale and meet surges in demand but then bring an app back on-premises when demand plateaus to reduce costs.

The old adage "what's old is new again" rings true for hybrid cloud. Back in 2009, the National Institute of Standards and Technology (NIST) started developing a working definition of cloud computing. After 16 iterations, it finalized its work, noting that the multiple deployment models included private, public, and hybrid, which together offered a new way to deliver services.

NIST's rationale for hybrid cloud back then was that cloud computing is "a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." And it noted that organizations that used cloud were more likely to reap the promised benefits of cloud: cost savings, energy savings, rapid deployment, and customer empowerment.

It is safe to say those insights hold true today. But what of the future? Hybrid cloud reflects enterprise IT's overall and ongoing shift from a centralized to a distributed model. And perhaps the best indication of hybrid cloud's value going forward is the synergy it brings with edge.

Edge is exploding. Sensors, IoT devices, and other data-producing units provide status and operational data for everything from an IT device to equipment on a production line to a pallet in a tractor trailer. Rapid analysis of that data is key to many aspects of business operation.

As such, there is a great overlap between hybrid cloud and edge. Hybrid cloud is about running workloads in the best possible environments; edge computing is about bringing the environment to a workload and its associated data.

Hybrid deployments are the new norm. Most businesses are shifting workloads to the cloud or building new cloud-native applications from scratch. With few exceptions (e.g., pure cloud-native startups), workloads and data will run and reside on a combination of on-premises systems, private clouds, and public cloud services.

Melding these deployment options gives businesses the ability to support digital transformation efforts while also scaling existing applications and services.

Read the rest here:
Why Hybrid Cloud and Why Now? - RTInsights

TikTok pushes back on House official's warning about use – The Hill

TikTok pushed back on claims that a top House official made last week about security concerns related to the social media platform.

The company sent a letter to Chief Administrative Officer of the House Catherine Szpindor on Thursday, obtained by Politico. The company wrote that her "TikTok Cyber Advisory" contains inaccurate information and needs to be rescinded. The advisory warned lawmakers of security risks with the platform.

When asked for confirmation, a spokesperson for TikTok said the letter Politico published appears to be authentic.

Szpindor's office said in the memo that it considers TikTok high risk due to a lack of transparency about how the platform protects user data, its requirement of excessive permissions for people to use the app, and the potential security risks associated with using it.

The office concluded that members of Congress should not download or use the app because of these security and privacy concerns.

The memo said the app stores users' data (location, photos and other personally identifiable information) on servers in China, but TikTok rejected that claim in its letter, saying it stores the information in its own data centers in the United States and Singapore.

Michael Beckerman, vice president and head of public policy in the Americas for TikTok, said in the letter that all U.S. user traffic is being directed to Oracle Cloud Infrastructure and that the company expects to delete users' personal information from its own data centers and fully pivot to Oracle cloud servers in the U.S.

He denied claims in the memo that the platform uses facial recognition technology or uses face and voice data to identify users.

Beckerman said that TikTok does not automatically collect precise GPS location data, as the memo states. The letter says the company does collect users' approximate location based on their SIM card and IP address, which helps improve the user experience, comply with local laws and prevent fraud.

He said TikTok does not collect information such as the SIM card's serial number, active subscription information or integrated circuit card identification numbers.

Beckerman said he wants to meet with Szpindor to discuss the information in TikTok's letter.

Szpindor's memo came as an increasing number of lawmakers have begun using TikTok to convey messages and reach new demographics ahead of the November midterm elections.

Almost every Democratic lawmaker voted for a provision in last years defense policy bill that prevented government employees from using TikTok on any government-issued device, but numerous caucus members have posted content on the platform.

Read this article:
TikTok pushes back on House officials warning about use - The Hill

The Global Endpoint Detection and Response Market size is expected to reach $7.1 billion by 2028, rising at a market growth of 24.8% CAGR during the…

Endpoint detection and response (EDR) has several benefits, including the capacity to identify threats in real-time and enable an immediate halt to a breach. Additionally, EDR tools help analyze data to spot threat patterns, automatically react to threats to neutralize them or take other measures, and alert security staff.

New York, Aug. 19, 2022 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Global Endpoint Detection and Response Market Size, Share & Industry Trends Analysis Report By Vertical, By Component, By Deployment Type, By Organization Size, By Enforcement Point, By Regional Outlook and Forecast, 2022 - 2028" - https://www.reportlinker.com/p06315005/?utm_source=GNW It can be used as a forensics tool to support an investigation, which is anticipated to offer the industry lucrative growth potential. Furthermore, it tracks events and processes including registry changes, logins, and network connections to enable real-time visibility across all endpoints.

That makes it possible for the group or business to look into any suspicious conduct right away. The workload involved in identifying and responding to threats can also be lessened with the use of enhanced endpoint detection and response (EDR) processes. Removing the requirement for the security team to constantly monitor several tools and dashboards saves time and resources, which fuels the market's growth.

Technology platforms known as EDR tools enable quick analysis and containment of attacks on endpoints in addition to warning security teams of potentially harmful activities. A workstation or laptop used by an employee, a server, a cloud system, a smartphone, or an IoT device are all examples of endpoints. EDR systems typically aggregate endpoint data, such as process execution, terminal communication, and customer logins; analyze data to find anomalies and suspicious attacks; and record data about harmful conduct. This gives security teams the information they need to look into and handle issues. They also make it possible to perform manual and automatic tasks to neutralize threats on the endpoint, such as wiping and reimaging the device or isolating it from the network.

Security teams now have the insight they need to find problems that might otherwise go undetected, thanks to EDR security solutions, which keep track of all endpoint and workload activity and events. An EDR system must offer continuous and thorough visibility into what is occurring on endpoints in real-time. An EDR tool should have sophisticated capabilities for threat detection, investigation, and response, such as incident data search, investigation alert triage, validation of suspicious activity, threat hunting, and detection and containment of malicious activity.

COVID-19 Impact Analysis

The COVID-19 pandemic has caused social unrest and economic turbulence all across the world. The lockdown imposed by many governments has positively affected the adoption of endpoint detection and response (EDR). Following the effects of COVID-19, businesses are focusing on cutting-edge technologies to perform contactless operations safely and securely. These technologies include AI-powered remedies, computing technology, automation, and cloud-based endpoint detection and response across industries like BFSI, healthcare, government, and others. Additionally, as companies transition to digitalization, there is a growing need for an EDR solution that is reliable, AI-integrated, and has real-time reporting capabilities.

Market Growth Factors

Malware, crypto-jacking, and enterprise email hacks are all on the rise.

Crypto-jacking incidents drive companies to update unsecure systems, eliminating a potential point of entry for ransomware. Organizations implemented recommended fixes to address the exploited vulnerability. Research conducted by Expel found that web application compromises in 2021 frequently led to the introduction of a coin miner. This typically occurs when a victim unwittingly installs malicious scripts that let a cybercriminal access the device or other internet-connected devices, such as by clicking on an unknown link in an e-mail or visiting a malicious website. Because they are digital, cryptocurrencies can only be produced with computer programs and processing power.

Security rules and compliance reporting can both be changed.

A company's rules and laws are unique to it. They may include goals, procedures, and risks that are specific to that company. Using the finest EDR tools, each user can create a security policy based on a customized rules engine. By using this engine, users can implement security and operational guidelines that will then be updated to address shifting business needs, changing threats, and applicable laws and regulations. Using custom security rules, the EDR team can filter out unimportant noise and remain focused on identifying both predictable and unanticipated risks. A flexible rules engine improves the provider's efficiency and accuracy, enabling it to identify and protect against a wider range of threats in the particular environment of the business.
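As a toy illustration of such a rules engine (purely hypothetical, and far simpler than any real EDR product's rule language), rules can be plain predicates over endpoint events, and triage keeps only the events that match:

```python
# Hypothetical custom rules: each is a predicate over one endpoint event.
rules = [
    # Scripted network activity from a shell interpreter?
    lambda e: e["process"] == "powershell.exe" and e["network"],
    # Registry write to an autorun key (possible persistence)?
    lambda e: e["registry_write"] and e["path"].startswith("Run"),
]

def triage(events, rules):
    """Surface events matching any rule; filter the rest out as noise."""
    return [e for e in events if any(rule(e) for rule in rules)]

events = [
    {"process": "powershell.exe", "network": True,
     "registry_write": False, "path": ""},
    {"process": "notepad.exe", "network": False,
     "registry_write": False, "path": ""},
]
print(len(triage(events, rules)))  # 1: only the powershell event surfaces
```

Swapping rules in and out of the list is the customization the passage describes: the engine stays the same while the policy tracks the business's changing threats and regulations.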

Market Restraining Factors

Poor And Erratic Third-Party Software

The security of the third-party infrastructure and a loss of control are just two disadvantages of outsourcing security tasks to a third-party EDR network operator. The service company's cyber architecture must be secure and up to date if it is to be effective in fending off the most recent sophisticated threats. An EDR service provider's infrastructure may include sensitive customer and employee information from multiple companies, making it more susceptible to frequent and intricate attacks. Because of this, businesses could be reluctant to give these service providers access to valuable data.

Component Outlook

On the basis of component, the Endpoint Detection and Response market is divided into Solution and Services. The Services segment witnessed a significant revenue share in the endpoint detection and response market in 2021. This is because Endpoint Detection and Response services provide a skilled team of threat hunters, cutting-edge EDR technology, and up-to-the-minute threat data to discover threats that other measures may miss, which dramatically improves the visibility of attacks aimed at endpoint devices.

Deployment Mode Outlook

Based on the Deployment Mode, the Endpoint Detection and Response market is classified into On-premise and Cloud. The on-premise segment procured the largest revenue share in the endpoint detection and response market in 2021. This is due to the many benefits that come with on-premises deployment, including the high level of data protection and safety. Companies favor on-premises models over cloud-based deployment models because they offer greater data security and experience fewer data breaches, which is what drives demand in both the public and private sectors.

Organization Size Outlook

By organization size, the Endpoint Detection and Response market is bifurcated into Large Enterprises and SMEs. The small & medium enterprises segment recorded a significant revenue share in the endpoint detection and response market in 2021, because cyber-attacks on SMEs result in severe data and financial losses. With most businesses struggling as a result of COVID-19, banks are seeing increased demand for business loans, especially from SMEs, to maintain operations during the pandemic, reduce the operational risks associated with remote working, and comply with regulations to avoid fines.

Enforcement Point Outlook

Based on enforcement point, the Endpoint Detection and Response market is segmented into Workstations, Mobile Devices, Servers, Point of Sale Terminals, and Others. The workstation segment acquired the largest revenue share in the endpoint detection and response market in 2021. Enterprises often offer remote-management options so security professionals can set the proper parameters, and each endpoint transmits audit data to a central repository where it can be audited and analyzed. Consumers don't require the same level of centralized management. A workstation is a specialized computer made for technical or scientific tasks.
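The centralized-audit pattern described above, where every endpoint forwards its audit records to one repository for analysis, can be sketched as follows. This is a minimal in-process illustration under assumed names (`EndpointAgent`, `CentralRepository` are hypothetical); a real deployment would use authenticated network transport instead of a direct method call.

```python
# Hypothetical sketch: endpoints reporting audit data to a central repository.
import time
from typing import List

class CentralRepository:
    def __init__(self) -> None:
        self.records: List[dict] = []

    def ingest(self, record: dict) -> None:
        # All endpoints feed the same store, so events can be correlated.
        self.records.append(record)

    def query(self, host: str) -> List[dict]:
        """Return all audit records reported by a given endpoint."""
        return [r for r in self.records if r["host"] == host]

class EndpointAgent:
    def __init__(self, host: str, repo: CentralRepository) -> None:
        self.host, self.repo = host, repo

    def report(self, event: str) -> None:
        # In a real deployment this would be an authenticated network call.
        self.repo.ingest({"host": self.host, "event": event, "ts": time.time()})

repo = CentralRepository()
for host in ("ws-001", "ws-002"):
    EndpointAgent(host, repo).report("login")

print(len(repo.records))  # 2
```

Collecting everything in one place is what enables the cross-endpoint auditing and analysis the workstation segment relies on.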

Vertical Outlook

On the basis of vertical, the Endpoint Detection and Response market is fragmented into Retail, Healthcare & Life Sciences, Manufacturing & Automotive, Government & Public Sector, BFSI, IT & Telecom, and Others. The IT & Telecom segment procured the highest revenue share in the endpoint detection and response market in 2021. It is well established that automatic speech recognition systems depend critically on endpoint detection. The system analyzes the signal's static (energy) and dynamic (delta energy) characteristics frame by frame. The endpoint detector can be adapted to the working environment, such as telephone lines, and can track changes in background noise levels.
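The frame-by-frame energy analysis mentioned above can be illustrated with a short sketch. The frame length and threshold below are hypothetical placeholders; in practice the threshold would be tuned to the environment (e.g. telephone-line noise) as the text notes.

```python
# Illustrative sketch of energy-based speech endpoint detection.
from typing import List

def frame_energy(samples: List[float], frame_len: int = 160) -> List[float]:
    """Split the signal into frames and return the energy of each frame."""
    frames = [samples[i:i + frame_len] for i in range(0, len(samples), frame_len)]
    return [sum(s * s for s in f) for f in frames]

def delta_energy(energies: List[float]) -> List[float]:
    """Dynamic feature: change in energy between consecutive frames."""
    return [b - a for a, b in zip(energies, energies[1:])]

def speech_frames(energies: List[float], threshold: float) -> List[bool]:
    """Mark frames whose energy exceeds the (environment-tuned) threshold."""
    return [e > threshold for e in energies]

# Synthetic signal: silence followed by a burst of 'speech'.
signal = [0.0] * 320 + [0.5] * 320
e = frame_energy(signal)
print(speech_frames(e, threshold=1.0))  # [False, False, True, True]
```

The first frame where the energy (or its delta) crosses the threshold marks the speech endpoint; adapting the threshold to the measured background level is what makes the detector robust to noisy channels.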

Regional Outlook

Region-wise, the Endpoint Detection and Response market is analyzed across North America, Europe, Asia Pacific, and LAMEA. The North America segment acquired the highest revenue share in the endpoint detection and response market in 2021. In the North American region, the market is anticipated to exhibit robust growth due to the rising use of endpoint detection and response for security applications in the forensic, government, and banking & finance sectors. However, during the forecast period, the Asia Pacific region is anticipated to develop at the fastest rate, as the region has strong IT infrastructure, good software, and reliable services.

The major strategy followed by the market participants is partnerships. Based on the analysis presented in the Cardinal matrix, Microsoft Corporation is the forerunner in the Endpoint Detection and Response market. Companies such as Cybereason Inc., Broadcom, Inc., and SentinelOne, Inc. are some of the key innovators in the Endpoint Detection and Response market.

The market research report covers the analysis of key stakeholders of the market. Key companies profiled in the report include Broadcom, Inc., Cisco Systems, Inc., Microsoft Corporation, OpenText Corporation, Crowdstrike Holdings, Inc., McAfee Corp., VMware, Inc., Palo Alto Networks, Inc., SentinelOne, Inc., and Cybereason Inc.

Recent Strategies deployed in Endpoint Detection and Response Market

Partnerships, Collaborations and Agreements:

Dec-2021: Cybereason entered into a partnership with M.Tech, one of the biggest cybersecurity and network performance solutions suppliers. Through this partnership, M.Tech would offer customers a Managed Detection and Response solution powered by the Cybereason XDR Platform. Additionally, the platform delivers Extended Detection and Response, Next-Generation Antivirus, Endpoint Detection and Response, and Predictive Ransomware Protection to business customers through a single convenient mechanism.

Sep-2021: SentinelOne formed a partnership with Deepwatch, a leader in advanced managed detection and response (MDR) security. Through this partnership, Deepwatch would offer SentinelOne Endpoint Detection and Response as part of its award-winning MDR portfolio. Additionally, the integration of SentinelOne endpoint protection and Deepwatch MDR allows detection engineers to capture more endpoint data, which can be ingested into the broader Deepwatch SecOps Platform for contextualization and correlation.

Jul-2021: SentinelOne partnered with ConnectWise, a foremost provider of business automation software for technology solution providers. Together, the companies aimed to support their joint fight against cybercriminals by delivering SentinelOne Control and SentinelOne Complete as integrated products in the ConnectWise security product offering.

Jul-2021: Cybereason partnered with CYDERES, the security-as-a-service division of Fishtech Group. Through this partnership, the companies aimed to bring improved Managed Detection and Response to prospective and current managed security clients. Additionally, CYDERES and Cybereason's mutual customers would immediately benefit from the Cybereason Defense Platform, which integrates EDR, preventative next-gen antivirus, and proactive threat hunting to reduce cyber threats.

Jun-2021: SentinelOne entered into a partnership with Coalition, a leading cyber insurance and security business. Together, the companies aimed to deliver an integrated solution of premium threat monitoring and cyber insurance to safeguard companies from ransomware. Additionally, the partnership would give customers of both Coalition and SentinelOne financial incentives to adopt EDR solutions and cyber insurance, encouraging a more holistic risk-management approach.

Jan-2021: Cybereason entered into a partnership with Ensign InfoSecurity, one of the largest pure-play cybersecurity firms. This partnership enables joint customers to access next-generation detection, elite threat hunting, and response capabilities, and provides actionable security intelligence to manage the growing cyber threats encountered by companies and their distributed workforces. Additionally, this partnership would give Ensign's customers access to the Cybereason Defense Platform, which combines endpoint detection and response with endpoint prevention capabilities.

Mar-2020: SentinelOne formed a partnership with Lenovo, a Chinese multinational technology corporation. This partnership aimed to combine SentinelOne's autonomous endpoint protection platform with Lenovo's ThinkShield security offering. Additionally, Lenovo customers can now buy machines with SentinelOne, providing IoT security, real-time protection, ActiveEDR, and cloud workload security powered by patented Behavioral AI.

Product Launches and Product Expansions:

Oct-2021: CrowdStrike introduced Falcon XDR, which extends CrowdStrike's Endpoint Detection and Response (EDR) capabilities to provide real-time automated detection and response across the complete security stack. Additionally, Falcon XDR supplies security teams with the data and tools to respond to, contain, and remediate sophisticated attacks quickly.

Feb-2020: Cisco unveiled Cisco SecureX, a unified cloud-native security platform. Cisco SecureX delivers a complete user experience across the breadth of Cisco's integrated security portfolio and customers' existing security systems. Moreover, Cisco SecureX provides visibility, identifies unknown threats, and automates workflows to strengthen customers' protection across the endpoint, network, cloud, and applications.

Acquisitions and Mergers:

Jun-2021: Cisco acquired Kenna Security, a cybersecurity leader that delivers a risk-based vulnerability management platform. Through this acquisition, the company aimed to integrate Kenna's market-leading, machine-learning-driven, risk-based vulnerability management platform into Cisco's open SecureX platform.

Jan-2021: Cisco took over Banzai Cloud, a privately held company that specializes in deploying cloud-native applications. Under this acquisition, the Banzai Cloud team boosts Cisco's capabilities and expertise through its proven experience with end-to-end cloud-native deployment, application, runtime, development, and security operations.

Sep-2020: CrowdStrike completed the acquisition of Preempt Security, a foremost provider of Zero Trust and conditional access technology. Under this acquisition, CrowdStrike would provide customers with improved Zero Trust security capabilities and bolster the CrowdStrike Falcon platform with conditional access technology. Additionally, the acquisition would help customers achieve end-to-end visibility and enforcement of identity data.

Mar-2020: McAfee took over Light Point Security, a pioneer in browser isolation. Through this acquisition, the company aimed to add Light Point Security's robust browser isolation capabilities, along with its award-winning team, to the McAfee product offering. Additionally, browser isolation safeguards customers against web-based hazards because malicious code is contained in an isolated environment and cannot reach the endpoint.

Scope of the Study

Market Segments covered in the Report:

By Vertical

IT & Telecom

BFSI

Manufacturing & Automotive

Retail

Government & Public Sector

Healthcare & Life Sciences

Others

By Component

Solution

Services

By Deployment Type

On-premise

Cloud

By Organization Size

Large Enterprises

Small & Medium Enterprises

By Enforcement Point

Workstations

Servers

Mobile Devices

Point of Sale Terminals

Others

By Geography

North America

o US

o Canada

o Mexico

o Rest of North America

Europe

o Germany

o UK

o France

o Russia

o Spain

o Italy

o Rest of Europe

Asia Pacific

o China

o Japan

o India

o South Korea

o Singapore

o Malaysia

o Rest of Asia Pacific

LAMEA

o Brazil

o Argentina

o UAE

o Saudi Arabia

o South Africa

The Global Endpoint Detection and Response Market size is expected to reach $7.1 billion by 2028, rising at a market growth of 24.8% CAGR during the...