Category Archives: Cloud Servers

TRAI Exploring Emergency Ban on WhatsApp and Facebook – IndianWeb2.com

TRAI releases Consultation Paper on Regulatory Mechanism for Over-The-Top (OTT) Communication Services, and Selective Banning of OTT Services.

The Department of Telecommunications (DoT) has requested TRAI to suggest a suitable regulatory mechanism for OTTs, including issues relating to 'selective banning of OTT services', as part of its recommendations to DoT.

Further, a parliamentary panel has recommended that the DoT examine TRAI's recommendation and come out with a policy enabling the selective banning of OTT services such as Facebook, WhatsApp and Telegram during unrest and crisis, as these apps are prone to misuse by terrorists or anti-national elements in specified regions.

In December 2021, a Parliamentary Standing Committee on Communication and Information Technology submitted its 26th report, titled 'Suspension of telecom services/Internet and its impact'.

The Committee feels that it would be of great relief if the DoT could explore the option of banning selective services, such as Facebook, WhatsApp, Telegram, etc., instead of banning the internet as a whole. This would allow financial services, health, education, and various other services to continue operating as usual, thereby minimizing inconvenience and suffering to the general public, and would also help control the spread of misinformation during unrest.

Through a letter dated 7th September 2022, DoT has also mentioned that "in view of the humongous growth of services in the recent past and these services having reached a mature stage, there is a need to holistically look into the various aspects of these services including regulatory, economic, security, privacy, and safety aspects. This is also in keeping with para 2.2 of the National Digital Communications Policy - 2018, which mentions the policy goal of 'Ensuring a holistic and harmonised approach for harnessing Emerging Technologies'. It has been mentioned therein that a policy framework for 'Over the Top' services will be developed."

The consultation paper mentions: "In some use cases such as messaging, consumer preferences have switched from traditional telecommunication services to OTTs. With OTTs used increasingly for messaging and to some extent for voice communication, globally, the general trend is a transition from voice and SMS towards data as a primary source of revenue for telecom service providers. In India, the composition of the revenue basket of wireless access service providers has undergone a sea change in the period from the year 2013 to 2022."

Websites that use dynamic IP addresses and are hosted on cloud servers can pose a challenge to conventional blocking methods, the consultation paper said. In such situations, alternative methods may be necessary to filter internet content effectively, and advanced techniques can be employed to identify and block access to such websites.

Further, there may be scenarios where the targeted websites use the Hypertext Transfer Protocol Secure (HTTPS) protocol. HTTPS provides encryption and security for websites, making it difficult for service providers to block content on these sites. However, there are still ways to block or filter content at the network level, such as using a firewall or content-filtering software. As far as area-specific barring is concerned, it also needs to be carried out at the network level, for which effective methods need to be worked out.
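
The core difficulty the paper describes, that an IP-based block breaks as soon as a cloud-hosted site rotates its address, can be sketched in a few lines. This is an illustrative toy, not any regulator's or ISP's actual filtering stack; the hostnames and addresses below are made up:

```python
# Hypothetical blocklist entries, for illustration only.
BLOCKED_HOSTS = {"blocked.example.com"}
BLOCKED_IPS = {"203.0.113.10"}  # the address the site used yesterday

def blocked_by_ip(current_ip: str) -> bool:
    """IP-based rule: stops matching once a cloud-hosted site
    rotates to a new dynamic address."""
    return current_ip in BLOCKED_IPS

def blocked_by_name(hostname: str) -> bool:
    """Name-based rule (e.g. DNS or TLS SNI inspection): still
    matches after the underlying IP changes."""
    return hostname.lower().rstrip(".") in BLOCKED_HOSTS

# The site has moved to a fresh address: the IP rule no longer
# fires, but the name rule still does.
print(blocked_by_ip("203.0.113.99"))           # False
print(blocked_by_name("blocked.example.com"))  # True
```

Name-based matching is one of the "alternative methods" that survives dynamic addressing; for HTTPS traffic the hostname is typically still visible in the TLS SNI field even though the content is encrypted.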

The Department has also informed the parliamentary committee that Facebook, WhatsApp, Telegram, etc. are categorized as OTT services. These OTT services ride over the existing telecom service providers' networks.

In the consultation paper, the regulator has sought views on the requirement of a framework for OTTs encompassing lawful interception, privacy and security, customer verification, pesky calls and messages, technical challenges in banning specific services, etc.

In several past consultation papers, TRAI has set aside the demand for regulating internet-based calling and messaging apps.

TRAI has set August 4 as the last date for comments on the paper and August 18 for counter comments.


Flare Network and Ankr Partner to Enhance Developer Experience with High-Performance RPC Infrastructure – CoinCodex

Key Highlights

Flare Network, the pioneering EVM-based smart contract platform focused on secure data acquisition and interoperability, has today announced a strategic partnership with Ankr, a decentralized Web3 infrastructure provider, to bolster its suite of remote procedure call (RPC) tools for developers.

Per the announcement, this strategic partnership aims to provide developers building on Flare with access to distributed and high-performance RPC services, streamlining dApp development processes and facilitating seamless node operations.

Paving the way for new use cases and monetization models with its unique interoperable cross-chain solution, Flare Network stands out as an EVM-based Layer-1 smart contract platform that extends the utility of blockchain technology. By optimizing decentralized off-chain data acquisition, Flare empowers developers with oracles built into the network's structure, providing a simple and coherent stack for decentralized interoperability. As an EVM-compatible chain, Flare enables the deployment of decentralized applications coded in Solidity, facilitating the trustless acquisition of price and time series data, blockchain events and state data, as well as web2 API data on a large scale.

The collaboration enables developers on the Flare Network to leverage Ankr's cutting-edge infrastructure, which includes advanced APIs with custom indexing. These APIs not only simplify onboarding for first-time users but also offer seamless querying for on-chain data, including non-fungible token (NFT) metadata. By integrating Ankr's services, Flare Network aims to reduce friction and enhance the overall developer experience.

On the other hand, Ankr, a leading cloud infrastructure provider in the Web3 space, currently offers RPC services to over 35 different blockchains. Their services encompass both HTTPS and WebSocket RPC endpoints, leveraging bare metal cloud servers, alongside a comprehensive suite of developer tools. With more than 600 nodes located across 12 countries and five continents, Ankr's RPC services provide rapid and efficient dApp request handling. Utilizing load-balancing technology, Ankr ensures that all requests are automatically directed to the nearest data center, guaranteeing 99.99% uptime. Furthermore, with an impressive average response time of just 110ms, Ankr's nodes exhibit industry-leading speeds and low latency.
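
The HTTPS RPC endpoints described above accept JSON-RPC 2.0 requests. The sketch below only builds such a request body for a standard EVM method; the endpoint URL is a placeholder, not a documented Ankr address:

```python
import json

# Placeholder endpoint, not a documented Ankr address.
RPC_URL = "https://rpc.example.com/flare"

def make_rpc_payload(method: str, params: list, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body of the kind an EVM
    RPC endpoint expects."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# eth_blockNumber is a standard EVM method with no parameters;
# the endpoint replies with the latest block height in hex.
payload = make_rpc_payload("eth_blockNumber", [])
print(payload)
```

The same payload shape works over the WebSocket endpoints as well; only the transport differs, which is why providers can load-balance HTTP requests across data centers transparently.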

Hugo Philion, the CEO and co-founder of Flare, emphasized the immense benefits that developers would gain from the partnership with Ankr. He highlighted that this alliance allows developers on Flare to unlock the power of premium RPC tooling provided by Ankr, the leading node provider in the blockchain industry. In addition to the API Portal and Google Cloud infrastructure, this strategic partnership underscores Flare's unwavering dedication to empowering developers with a versatile, resilient, and trustworthy stack. The goal is to provide developers with the necessary tools to effortlessly build and monitor their dApps and services, all while ensuring enterprise-grade levels of availability and security.


Africa Telecom Towers and Allied Market Size & Share Analysis – Growth Trends & Forecasts (2023 – 2028) – Yahoo Finance

ReportLinker

The Africa Telecom Towers and Allied Market size in terms of installed base is expected to grow from 199,092 units in 2023 to 249,652 units by 2028, at a CAGR of 4.63% during the forecast period (2023-2028).

New York, July 05, 2023 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Africa Telecom Towers and Allied Market Size & Share Analysis - Growth Trends & Forecasts (2023 - 2028)" - https://www.reportlinker.com/p06472491/?utm_source=GNW

With the outbreak of COVID-19, the telecom industry has witnessed a significant increase in demand for internet services due to a major chunk of the population staying at home and remote working conditions. The increase in people working from home has led to an increase in demand for downloading, online video viewing, and communication through video conferencing, all of which are leading to increased network traffic and data usage.

Key Highlights

The telecom tower industry has evolved drastically over the past decade. The core towerco proposition and business models have been successfully adapted to match the demands of new markets in Africa. Many towercos are anticipated to hunker down in their core business of building, buying, and leasing vertical real estate over the forecast period, and such towercos may still see plenty of 5G antennas overlaid onto their towers.

As per the Nigerian Communications Commission, as of January 2021, the number of third- and fourth-generation telecom towers deployed in Nigeria has grown by 73.2%. Also, global tower companies are expanding their presence in the region through strategic collaborations, due to the growing number of opportunities presented.

The emergence of KaiOS and its partnerships with operators across Africa is helping overcome the affordability barrier for low-income users. The free resources offered, such as the Life app, also help new users develop digital skills and understand how the internet can be relevant. Such initiatives are expected to significantly boost internet penetration in these countries.

Several initiatives by telecom operators and other organizations, especially in low- and middle-income countries, are expected to spur growth in rural areas as residents gain increased access to internet connectivity.

Furthermore, with businesses going mobile and adopting new concepts, like BYOD, to increase employee interaction and ease of use, it has become essential to provide a high-speed, quality network. Organizations have been looking to adopt BYOD aggressively in their operations, fueling market growth over the forecast period.

Furthermore, development in cloud-based services for mobile users and the roll-out of 4G LTE services worldwide have increased carriers' investment in networks, which drives the demand for telecom towers. The increasing emphasis on improving internet connectivity in rural areas is one of the major factors stimulating the deployment and improvement of telecom infrastructure in these areas, thereby aiding the market's growth. Smartphone penetration, rising awareness, increasing penetration of digital technologies, and investments from several organizations and governments have been increasing the adoption of internet connections in the region.

Africa Telecom Towers & Allied Market Trends

Optical Fiber Market is Expected to Grow Significantly During the Forecast Period

The telecommunication and networking market witnessed a massive surge in demand in the region. The emergence of IoT in cloud computing and the demand for 5G networks are driving increased usage of optical fiber in a wide variety of applications: business, government, industrial, academic, and cloud servers in public and private networks.

As per CommsUpdate, there was an interval of just three years between the launch of 3G and 4G services in Algeria, leading to issues. While all three major operators (Djezzy, Mobilis, and Ooredoo) have extended their coverage to all 48 provinces, all of them received penalties from the regulator in 2020 for the poor quality of their services.

Further, as per the Nigerian Communications Commission, as of January 2021, fiber-optic cables have expanded by 16.4% in the last five years. Also, according to the IFC, a total of 1.1 million km of fiber optics has been installed in Africa, of which 50% has been deployed by private mobile network operators (MNOs). Moreover, about 40% of all fiber-optic cable in Africa, a staggering 450,000 kilometers, is publicly owned. This includes government networks, state-owned enterprises (SOEs), and utilities.

The region has a developing telecom infrastructure, with growth encouraged by supportive regulatory measures and by government policies aimed at delivering serviceable internet connections across the region. Government-funded efforts, including the Universal Service Telecommunications (UTS) program, continue to ensure that fixed-line infrastructure is extended to underserved areas. Thus, the slow growth in the number of fixed-telephony connections should be maintained over the next few years.

Companies are entering various partnerships to provide better services while controlling operating costs. For instance, at the beginning of 2021, Ooredoo Algeria deployed Nokia's cloud-native Core software to strengthen its network performance and reliability cost-effectively, strategically positioning itself to launch new services to meet customer needs. This deployment is likely to further improve the country's digital ecosystem. The infrastructure is based on a terrestrial fiber-optic network coupled with undersea cables, offering secure connectivity abroad from West Africa. This investment aims to support the digital ecosystem and meet the region's growing connectivity needs.

Telecom Tower Market to Grow Significantly during the Forecast Period

The core towerco proposition and business models have been successfully adapted to match the demand from new markets in Africa. Many towercos are anticipated to hunker down in their core business of building, buying, and leasing vertical real estate over the forecast period. Such towercos may still see plenty of 5G antennas overlaid onto their towers.

As per the Nigerian Communications Commission, as of January 2021, the number of third- and fourth-generation telecom towers deployed in Nigeria has grown by 73.2%. Also, global tower companies have been expanding their presence in the region through strategic collaborations, due to the growing number of opportunities presented.

In January 2020, American Tower acquired Eaton Towers in a deal that included towers across five African countries. While American Tower already had a presence in Africa, the acquisition was a significant deal, demonstrating the types of investment being made in the region, particularly in the tower market.

According to estimates by TowerXchange, there are roughly 25,767 towers in South Africa, serving 97 million SIMs, making it one of Africa's best-covered markets. Five MNOs operate within the South African market: MTN, Vodacom, Telkom, Cell C, and Rain. Cell C is in the process of shutting down its network and switching to a roaming agreement with MTN, having slowly approached bankruptcy since the sale of its portfolio to American Tower.

Furthermore, with businesses going mobile and adopting new concepts like BYOD to increase employee interaction and ease of use, it has become essential to provide a high-speed, quality network. Organizations are looking to adopt BYOD aggressively in their operations, thereby fueling market growth over the forecast period.

Further, growth in cloud-based services for mobile users and the roll-out of 4G LTE services worldwide have increased carriers' investment in networks, which drives the demand for telecom towers. Also, the increasing emphasis on improving internet connectivity in rural areas is one of the major factors stimulating the deployment and improvement of telecom infrastructure in these areas, thereby aiding the market's growth. Smartphone penetration, rising awareness, increasing penetration of digital technologies, and investments from several organizations and governments have been increasing the adoption of internet connections in the region.

Africa Telecom Towers & Allied Industry Overview

The African Telecom Towers and Allied market is moderately competitive and consists of many global and regional players. These players account for a considerable market share and focus on expanding their client base globally. They focus on research and development activities, strategic alliances, and other organic and inorganic growth strategies to sustain their positions in the market over the forecast period.

March 2022 - Helios Towers, the independent telecommunications infrastructure company, announced the acquisition of Airtel Africa's passive infrastructure company in Malawi, adding 723 sites to its portfolio.

January 2022 - ZESCO Limited and Copperbelt Energy Corporation PLC signed an agreement to make new power supply and transmission arrangements. The negotiations, which started on January 17, 2022, are expected to replace the bulk supply agreement that expired on March 31, 2020.

Additional Benefits:

The market estimate (ME) sheet in Excel format
3 months of analyst support

Read the full report: https://www.reportlinker.com/p06472491/?utm_source=GNW

About ReportLinker

ReportLinker is an award-winning market research solution. ReportLinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.




Disaster recovery and the cloud – IT-Online

Risk assessment, risk planning, and risk mitigation. And then the disaster happens, whether it be unplanned, prolonged load-shedding or a ransomware attack that takes your business offline. This is when disaster recovery and business continuity in the cloud kick in.

By Reshal Seetahal, head of the Alibaba Cloud business unit at BCX

Moving services, operations and data to the cloud not only ensures security, resilience and recovery, but also gives businesses an advantage in building efficiencies, maintaining service levels to their customers, and creating new revenue streams. When load-shedding hits and the lights go out, your data is not only backed up and secure, but available and accessible.

It's not just business as usual, but an opportunity to increase productivity, scalability, agility, and performance. The cloud is, quite simply, key to the digital transformation of all entities. It is a strategic tool, providing a platform that can create and develop solutions tailored to your business as it grows and evolves.

It would be remiss not to talk about the potential harm of disasters to businesses. These are the times we live in: a power grid collapse, a fibre cable cut under the ocean, floods, malware - the list is long. Having an on-site, one-server backup for your data and your company's operating services is akin to keeping your cash stashed under your mattress. The only good thing is that you think you know where it is and have a false sense of control over it.

The bad thing? Everything else. It's an easy, vulnerable, single-source target that requires constant scrutiny and security. When disaster hits, it is the first thing to go, the first thing to be targeted, and when it is gone, it is gone forever.

Not everyone has the resources to set up multiple backup servers. There is, surprisingly, still naivety among many businesses about their security, just as there is unease and trepidation about moving away from an on-site, lock-and-key setup to a cloud solution.

The solution often lies in a layered, nuanced system.

Alibaba Cloud Disaster Recovery (DR) supports warm standby, which acts as an extension of the organisation's on-premises environment. During warm standby, a mirror environment offers a scaled-down version of a fully functional environment that remains running in the cloud. This minimises recovery time and enables mission-critical systems to meet stringent RTO (recovery time objective) and RPO (recovery point objective) targets.
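
The RPO arithmetic behind a standby like this can be illustrated directly: the worst-case data loss is however much time has passed since the last replicated snapshot. This is a generic sketch of the concept, not Alibaba Cloud's actual DR API:

```python
from datetime import datetime, timedelta

def data_loss_window(last_snapshot: datetime, failure_time: datetime) -> timedelta:
    """Worst-case data loss if disaster strikes at failure_time:
    everything written since the last replicated snapshot."""
    return failure_time - last_snapshot

def meets_rpo(last_snapshot: datetime, failure_time: datetime,
              rpo: timedelta) -> bool:
    """Does the standby copy keep us inside the RPO target?"""
    return data_loss_window(last_snapshot, failure_time) <= rpo

snap = datetime(2023, 7, 1, 4, 0)   # last snapshot replicated to the cloud
fail = datetime(2023, 7, 1, 4, 10)  # moment the primary site goes dark
# 10 minutes of unreplicated changes against a 15-minute RPO target.
print(meets_rpo(snap, fail, timedelta(minutes=15)))  # True
```

RTO is the analogous budget on the other side of the failure: how long it takes the warm-standby mirror to scale up and take over, which is exactly what keeping it already running in the cloud minimises.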

Being ready, being able, and being aware of vulnerabilities and potential disasters is the first step toward becoming an entity that can work without fear of losing data and service. The cloud mitigates and negates the potential harm of a disaster, and creates an environment not just for recovery, but for continuity.



How AI, big tech, and spyware power Israel’s occupation – The New Arab

Automated Apartheid: How Israel's occupation is powered by big tech, AI, and spyware

In-depth: Israel's military occupation has become a laboratory for advanced surveillance systems, artificial intelligence, and spyware technology developed by Western corporations and Israel's army.

"AI was a force multiplier," boasted Israeli officials after Operation Guardian of the Walls, an 11-day military attack on Gaza in 2021 which displaced over 91,000 Palestinians and left over 260 dead.

Almost two years later, foreign aid, big tech, and new advanced surveillance systems have quite literally laid the groundwork for what Amnesty International calls an "Automated Apartheid", one that is powered by Western corporations like Google and Amazon on the outside, and entrenched by spyware and AI on the inside.

A new era: Occupation under automation

AI technology, combined with a new far-right government, has seen policies of repression in Israel's military occupation escalate at an unprecedented rate over the last few years.

"Autonomous weapon systems rely on sensor processing rather than human input in selecting and engaging a target," Omar Shakir, the Israel and Palestine Director at Human Rights Watch, told The New Arab. "These technologies make it easier to maintain and further entrench apartheid."

Since the beginning of 2023, the Israeli army has killed over 170 Palestinians, including at least 30 children. More than 290 Palestinian-owned buildings across the West Bank and East Jerusalem have been demolished or forcibly seized, displacing over 400 people and affecting the livelihoods or access to services of over 11,000 others.

In a recent comprehensive 82-page report on the use of technology in Israel's military occupation, Amnesty International detailed how many of these atrocities are made possible by automated weapons, spyware, and unauthorised biometric systems, calling them crimes against humanity.

"Spyware hacks into devices (phones or computers) without alerting the owner. The hackers open the microphone and camera on the device remotely to spy on the surroundings, and download all of the data on the device," Dr Shir Hever, the military embargo coordinator for the Palestinian Boycott, Divestment and Sanctions National Committee (BDSNC), told The New Arab.

Pegasus spyware, the specific system used by the Israeli military, is not only used to breach people's privacy by filing and scanning data, but is also utilised to obtain information even from encrypted messaging services, and to plant false evidence on the device without leaving a trace, Dr Hever added.

Most recently, Israel's military has come under fire for its "Wolf Pack" facial recognition systems.

Nadim Nashif, the General Director and Co-Founder of 7amleh - The Arab Center for the Advancement of Social Media, explained how Wolf Pack is used to facilitate Israel's occupation.

"It's an extensive predatory surveillance database system that contains profiles of nearly every Palestinian in the occupied West Bank, including photographs, family histories, education, and security ratings," he said.

There are several variations of the program - Red Wolf, Blue Wolf, and White Wolf - which all take information from Palestinians without consent.

"Blue Wolf has a colour-coded system that instructs soldiers either to arrest the individual or let them pass through. Israeli soldiers compete to capture the highest number of pictures in the app," Nashif explained.

"The updated version of Blue Wolf, Red Wolf, is now being used at illegal checkpoints in Hebron. If the system cannot locate the individual's image, it will register it in the databases, and they will often be denied crossing," Nashif added.

A lesser-known version, White Wolf, is used on Palestinian workers who have jobs in illegal settlements. It has the same tracking, harassment, and biometric features as the other two.

The emergence of "Smart Cities" in Israel has also allowed these tools to be deployed to track and surveil Palestinians under the guise of tech advancement.

"Places like Jerusalem have Smart City technology that uses cameras, facial recognition, and advanced technological systems at the entries of checkpoints," said Shakir.

With cameras pointing into homes and scanning Palestinians at checkpoints and as they go about their everyday lives, reality under Israeli occupation is becoming increasingly dystopian.

"Surveillance impacts our day-to-day activities and behaviours, adding to existing constraints on freedom of movement. We as Palestinians think twice before logging onto the internet, using our phones to call a loved one, or meeting with friends in a public space. We are cautious with every move we make, every word we say," Nashif explained.

"Residents in Hebron have become accustomed to the presence of drones flying over the city," he added, explaining how enforcing the occupation has become easier to sustain via technology: "Data obtained by facial recognition surveillance technology will be used to supply information to an AI-controlled machine gun equipped with ready-to-fire stun grenades and sponge-tipped bullets."

In some cases, data gathered by surveillance methods is used for Israel's policy of targeted assassinations, which are carried out without legal process.

"Drones - remote-controlled vehicles in the air (UAVs), on water, or on land, which usually carry surveillance equipment (mostly cameras) - are now being used as armed drones to commit assassinations," Dr Hever elaborated.

"It's another form of apartheid. Privacy is only a privilege for Jewish Israeli citizens, but not for the Indigenous population of Palestine," he said.

Western corporations: Buying and selling apartheid

While this technology is developed internally by the Israeli military, the means to do so often come from foreign aid, notably Western corporations.

"None of the technologies discussed here (drones, facial recognition, databases, etc.) is an Israeli invention," Dr Hever said.

"Western or transnational corporations have a long history of being complicit in and profiting off Israel's apartheid," added Apoorva G, the Asia Pacific campaigns coordinator for the BNC.

"From sports companies like Puma, Big Oil corporations like Chevron, and even infrastructure companies like Siemens and HD Hyundai, they (Big Tech) see oppression of Palestinians as a profitable project, which is related to the economic and environmental damage caused worldwide," Apoorva added.

A more recent and more concerning contract between big tech and Israel is Amazon and Google's Project Nimbus - a $1.2 billion agreement that provides cloud services to the Israeli army.

"Military attacks depend on servers and digital communication; surveillance entirely relies on such technology; databases storing information on Palestinian land records and population databases - they all require cloud servers. All of this is now going to be provided by Google and Amazon. And this project is already underway," Apoorva told The New Arab.

Since 2021, workers at these corporations and human rights organisations have been organising against the contract through the #NoTechForApartheid movement, but their efforts have not led to substantial change.

Sometimes these corporations themselves create weapons and export them to Israeli intelligence, creating a buy-and-sell version of occupation. Sophia Goodfriend, a PhD candidate in Anthropology at Duke University examining the ethics and impact of new surveillance technologies, explained how the tech and defence industries intersect.

"The IDF has a long history of outsourcing this R&D (research and development) to private start-ups, largely staffed by veterans of Israeli intelligence units," she said, citing companies like Oosto (formerly AnyVision), the NSO Group, and Black Cube, which have all been contracted to provide technology and services to Israel's military forces.

Global violence and repression

The fact that these systems are imported, bought, or sold has led to fears among researchers and activists about their global reach and impact on human rights.

"These technologies are promoted by private Israeli arms companies who are selling them around the world, even in violation of military embargoes," Dr Hever elaborated. "Just recently it was revealed that Israeli arms companies sell lethal weapons to the junta in Myanmar, despite the international arms embargo over the ethnic cleansing and genocide of the Rohingya people."

"We know this because this is the technology which the Israeli arms companies are putting up for sale with the slogan 'battle-tested'," adds Apoorva.

The development of AI technology surveillance in oppressive regimes will make these situations more volatile, especially when sold to existing military and security hierarchies.

"The more sophisticated the surveillance mechanisms, the greater their impact in terms of violence and repression is likely to be," Nashif said. "The use and abuse of surveillance technologies have led to disproportionate profiling, policing, and the criminalisation of racialised groups worldwide. Palestinians are no exception to these repressive practices."

The global market for autonomous military weapons is also growing as more of these systems are tested on Palestinians. "These are global trends, not just in Israel; countries like India, Russia, the UK, and the US are heavily investing in the military application of AI," Shakir says, noting that Israel is one of the top exporters of such weaponry.

As the world becomes increasingly automated, digital rights are at the forefront of conversations within human rights organisations. "AI technology, which is never neutral, will be fed with and taught past wrong decisions, reinforcing the bias against racialised communities," Nashif said.

Aina Marzia is an independent journalist based in El Paso, Texas. Her work has appeared in The Nation, The Daily Beast, Ms. Magazine, Insider, Teen Vogue, NPR, i-D, and more. When she is not writing, Aina organises with the National Student Press Law Center, ACLU, and the UCLA Center for Storytellers and Scholars.

Follow her on Twitter: @ainamarzia_


How a midsize American business recovered from a ransomware … – SC Media

At the CyberRisk Leadership Exchange in Cincinnati on June 7, the chief security officer of an Ohio bottling company used his lunchtime keynote address to recount how his company's eight-person IT team detected, remediated and recovered from a ransomware attack within the space of seven hours, without losing any business and without paying a dime to the attackers.

"We never missed an order. We never missed a delivery. Customer data was not compromised," Brian Balzer, Executive VP of Digital Technology & Business Transformation at G&J Pepsi-Cola Drink Bottlers, Inc., told SC Media in an interview. "I'd say probably 95% of the organization had no idea that we were under attack."

Balzer credits G&J's rapid, successful recovery from the ransomware attack to diligent preparation and a move to cloud-based operations, and to strong support from both colleagues and company leaders.

"I cannot stress enough to companies the importance of having such a strong culture where people are willing to jump in and help one another," says Balzer, "and a leadership team that is supportive of the cyber team, your IT team, whoever it might be, your digital team, to be able to put plans in motion."

Founded in 1925, G&J Pepsi serves Ohio, Kentucky and West Virginia and is the largest family-owned independent Pepsi bottler in the U.S., with more than a dozen facilities, 2,000 employees and $650 million in annual revenue.

The company first noticed something was wrong with its systems just before the Labor Day weekend of 2021.

"We had suspected that we might have allowed an intruder, a hacker into our environment," Balzer says. "We spent the better part of about four or five days trying to understand where they got in, where they were, if they were in and how we might be potentially exposed."

The G&J Pepsi team suspected that someone had used Cobalt Strike to install "beacons," or backdoors, into the systems, but their initial searches found nothing. Then a few days after Labor Day, a call came in around 4:30 in the morning.

"We got a call from one of the folks in our plant saying, 'Hey, something's weird, I can't access files,'" recounts Balzer. "And we knew instantly that we were under attack."

The G&J Pepsi team quickly took as many systems offline as it could. Balzer credits support from the very top of the company for that.

"I [had] to call my CEO at five o'clock in the morning and say, 'We're literally bringing all the systems down.'" Balzer says. "He's like, 'All right, I trust you. You just keep me posted.'"

Balzer's team found two potential points of entry. The first was a user who had unknowingly downloaded a corrupted file, a common vector for ransomware infection, but G&J Pepsi's endpoint solution quickly detected and remediated that.

The second point of entry was more serious. Just before the long weekend, Microsoft had released a patch for Exchange Server. But it looked very similar to another Exchange Server patch from two weeks earlier, one that G&J Pepsi had already implemented.

"There was some confusion as to, 'Was this the same patch that they released? Or was this a different patch?'" Balzer recalls. "We just misunderstood. It was probably on us. It was our fault for not getting that clarification quickly."

Instead of implementing the new patch right away, G&J Pepsi decided to wait until the following weekend. That's all the time the ransomware crew, identified as Conti by notes left on infected systems, needed.

"Within four days, they had exploited that particular gap in that Exchange server, and were able to compromise our environment," Balzer says. But, he added, "we don't deal with terrorists."

As a midsize company, G&J Pepsi fit the profile of a prime target for ransomware crews. The fact that the attack happened over a three-day weekend, which gives attackers more time to operate freely, was likely no coincidence.

"Most midsize companies and small companies can't thwart an attack, particularly a Conti ransomware or other sophisticated attacks that are that are taking place," Balzer says. "When we called for support, we [were told] that 'We'll try to help you, but we are absolutely slammed coming out of this three-day weekend because they went haywire on companies across the U.S.'"

Fortunately, because G&J Pepsi had already moved all its systems to the cloud, shutting down company assets and stopping the attackers was less complicated than it might have been for on-premises infrastructure.

"We have nothing on-premise," Balzer says. "Because we're 100% in the cloud, and because we utilize Microsoft Azure Cloud environment, we were able to prevent them from moving laterally across the platforms in our systems."

The virtual nature of G&J Pepsi's systems meant that the company was able to spend the next few hours using its weekly backups to spin up brand-new Azure instances free of ransomware, even as the team continued to investigate the infected systems.

"Within seven hours, we were able to stand up the entire environment again," Balzer told us. "Many of our solutions are SaaS solutions. The things that were impacted were more like file servers we had a couple of other servers that we had developed as IAS solutions in Azure that were at risk. We were able to basically rebuild and recreate that environment."

G&J Pepsi was lucky. None of the company's backups had been affected by the ransomware, and dark-web scans turned up no evidence of company data having been stolen.

"We were very fortunate that we had eyes on it immediately and were able to basically isolate and wall them off and then rebuild our environment," Balzer says.

However, staff PCs left on overnight in the office were infected, as were some ancillary servers. Rebuilding those took a bit more time.

Following the attack, G&J Pepsi brought on Arctic Wolf as a managed detection and response (MDR) provider and changed several company policies.

"We forced all password resets, we changed our policies on backups, we changed our policies on how many admin accounts that we have we limited those and really revamped the security," Balzer says.

Balzer told us that G&J Pepsi had also locked down USB ports on PCs, beefed up identity and access management and automated its systems patching. As it is a U.S.-only company, G&J Pepsi also blocked all system access from outside the country. The company has not had any serious incidents since.

In a separate interview with Microsoft, G&J Pepsi Enterprise Infrastructure Director Eric McKinney says he has learned two things from the company's brush with ransomware.

"If I could go back in time to the months leading up to our ransomware attack, I'd tell myself to strengthen our endpoint policies," McKinney tells Microsoft. "I don't view our recovery as a victory so much as a call to double down on security."

For McKinney, the second lesson was how much there is to be gained from a full cloud migration.

"G&J Pepsi has gotten a wide range of security benefits, such as platform-based backups, cloud-based identities, and multifactor authentication, leveraging native tools that help recommend and identify risk," McKinney says. "It doesn't matter whether you're a huge corporation like PepsiCo, a midsize business like G&J Pepsi, or a mom-and-pop gas station down the road I would make that move to the cloud and make it quickly."

Fielding questions from the audience following his keynote address at the Cincinnati CyberRisk Leadership Exchange, Balzer was struck by how many of his fellow cybersecurity executives wanted to hear about G&J Pepsi's experience.

"I love that the participation was there, that the curiosity was there," he told us. "People wanted to understand what was happening so that they can be aware of what to do if that ever occurs with them."

But Balzer once more stressed how important company culture is to an organization's ability to maintain resilience and quickly recover from an attack.

"The other thing that really stuck out, that we talked about for a brief bit during that [CyberRisk Leadership Exchange] session," Balzer says, "was the importance of having the right culture within your team to be able to come together to thwart an attack, particularly one of that size or even larger.

"We had a plan in place. Unfortunately, we had to use it. But fortunately for us, that plan worked," adds Balzer. "And that worked because we had the right leadership, the most senior leadership and support, and we had the right culture within our team to help support that and thwart that attack."

The next Cybersecurity Collaboration Forum event in Cincinnati will be a CyberRisk CISO Dinner at the end of September.

For more information on the Cybersecurity Collaboration Forum, including how to attend a CyberRisk Leadership event in your area, please visit https://www.cybersecuritycollaboration.com/.

Many thanks to Zack Dethlefs of the Cybersecurity Collaboration Forum.

See the original post:
How a midsize American business recovered from a ransomware ... - SC Media

PCI-Express Must Match The Cadence Of Compute Engines And Networks – The Next Platform

When system architects sit down to design their next platforms, they start by looking at a bunch of roadmaps from suppliers of CPUs, accelerators, memory, flash, network interface cards and PCI-Express controllers and switches. And the switches are increasingly important in system designs that have a mix of compute and memory types and for clusters that will be sharing components like accelerators and memory.

The trouble is this: The roadmaps are not really aligned well. Most CPU and GPU makers are trying to do major compute engine upgrades every two years, with architectural and process tweaks in the year in between the major launches so they have something new to sell every year. Makers of chips for networking switches and interface cards in the Ethernet and InfiniBand markets tend to be on a two-year cadence as well, and they used to tie their launches very tightly to the Intel Xeon CPU launch cadence back when that was the dominant CPU in the datacenter, but that rhythm has been broken by the constantly redrawn roadmaps from Intel, the re-emergence of AMD as a CPU supplier, and a bunch of other Arm CPU makers, including at least three hyperscalers and cloud builders.

And then there is the PCI-Express bus, which has been all over the place in the past two decades. While PCI-Express specifications have been released in a more predictable fashion in recent years, and PCI-Express controllers have been faithful to the PCI-Express roadmaps, PCI-Express switches from Microchip and Broadcom are well behind when it comes to product launches.

Sitting here on a quiet July morning, thinking about stuff, we think all of these roadmaps need to be better aligned. Specifically, we think that the PCI-SIG organization, which controls the PCI-Express specification through a broad and deep collaboration with the IT industry, needs to pick up the pace and get on a two-year cadence instead of the average of three years it has shown in the past two decades. And while we are thinking about it, we think the industry would be better served with a short-cadence jump to PCI-Express 7.0, which needs to be launched as soon as possible to get I/O bandwidth and lane counts in better alignment with high-throughput compute engines and what we expect will be an increasing use of the PCI-Express bus to handle CXL-based tiered and shared main memory.

Don't get us wrong. We are grateful that the PCI-SIG organization, a collaboration between all kinds of companies in the datacenter and now at the edge, has been able to get the PCI-Express bus on a predictable roadmap since the very late PCI-Express 4.0 spec was delivered in 2017. There were some tough signaling and materials challenges that kept the datacenter stuck at PCI-Express 3.0 for seven years, and we think Intel, which dominated CPUs at the time, dragged its feet a little bit on boosting I/O because it got burned by the SATA ports in the chipsets used with the Sandy Bridge Xeon E5s, which came out later than expected in March 2012. Rumors have abounded about the difficulties of integrating PCI-Express 4.0 and PCI-Express 5.0 controllers into processors since then.

Generally, a PCI-Express spec is released and then within about a year or so we see controllers embedded in compute engines and network interface chips. So when PCI-Express 4.0 came out in 2017, we saw the first systems using it coming out in 2018; specifically, IBM's Power9-based Power Systems machines, followed by AMD's Rome Epyc 7002s launched in August 2019. Intel didn't get PCI-Express 4.0 controllers into its Xeon SP processors until the Ice Lake generation in April 2021.

And even with the short two-year jump to the PCI-Express 5.0 spec in 2019, it wasn't until IBM launched the Power10 processor in its high-end Power E1080 machines in 2021 that it became available in a product. AMD didn't get PCI-Express 5.0 into a server chip until the Genoa Epyc 9004s launched in November 2022, and Intel didn't get PCI-Express 5.0 into a server chip until the Sapphire Rapids Xeon SPs launched in January 2023.

So it was really a three-year cadence between PCI-Express 4.0 and 5.0 products, as expressed in the controllers on the CPUs, even if the spec did a two-year short step.

We think that the specs and the products need to get on a shorter two-year cadence so the compute engines and the interconnects can all be lined up together. And that includes PCI-Express switch ASICs as well, which traditionally lagged pretty far behind the PCI-Express specs in the 3.0, 4.0, and 5.0 generations before they were widely available.

The lag between PCI-Express ports and PCI-Express switches at any given generation is a problem. That delay forces system architects to choose between composability (which ideally uses PCI-Express switches at the pod level) and bandwidth (which is provided through a direct server slot). Systems and clusters need to be designed with both composability and bandwidth, and we would add high radix to the mix as well.

At the moment, there are only two makers of PCI-Express switches: Broadcom (through its PLX Technologies acquisition a number of years ago) and Microchip. We profiled the Microchip Switchtec ASICs at the PCI-Express 5.0 level way back in February 2021, which scale from 28 to 100 lanes and from 16 to 52 ports, but as far as we know, they are not shipping in volume. Broadcom unveiled its PCI-Express 5.0 chip portfolio back in February 2022, including the ExpressFabric PEX 89100 switch, which has from 24 to 144 lanes and from 24 to 72 ports. We are confirming whether these are shipping as we go to press and have not heard back yet from Broadcom.

Our point is that PCI-Express switches have to be available at the same time that the compute servers, memory servers, and storage servers are all going to be created using chips that support any given level of PCI-Express. On Day One, in fact. You have to be able to embed switches in the servers and not lose bandwidth or ports or sacrifice radix to get bandwidth. We therefore need lots of suppliers in case one of them slips. This is one of the reasons why we were trying to encourage Rambus to get into the PCI-Express switch ASIC racket recently.

All of this is top of mind just as the PCI-SIG has put out the 0.3 release of the PCI-Express 7.0 spec.

Let's take a look at the projections we did for the PCI-Express roadmap a year ago, when the PCI-Express 6.0 spec was wrapped up and PCI-Express 7.0 appeared on the horizon:

The PCI-Express 7.0 spec is not expected to be ratified until 2025, and that means we won't see it appearing in systems until late 2026 or early 2027. We think this wait is far too long. We need PCI-Express 7.0 to provide the kind of bandwidth accelerators need to chew on the enormous amount of data required to run a simulation or train an AI model. We need it matched up with a fully complex CXL 4.0 specification for shared and pooled memory.
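To make the bandwidth argument concrete, here is a quick sketch using the published raw per-lane transfer rates and what they imply for a x16 link at each generation. Note these are raw figures; real-world throughput is lower once encoding and protocol overhead are subtracted.

```python
# Raw per-direction bandwidth of a PCI-Express x16 link by generation.
# Per-lane rates are the published transfer rates in GT/s.
GT_PER_LANE = {3.0: 8, 4.0: 16, 5.0: 32, 6.0: 64, 7.0: 128}

def raw_x16_gbps(gen: float, lanes: int = 16) -> float:
    """Raw GB/s per direction: transfers/s * lanes / 8 bits per byte."""
    return GT_PER_LANE[gen] * lanes / 8

for gen in sorted(GT_PER_LANE):
    print(f"PCIe {gen}: x16 ~ {raw_x16_gbps(gen):.0f} GB/s per direction")
```

Because the rate doubles each generation, a quick jump to 7.0 would quadruple x16 link bandwidth relative to the PCI-Express 5.0 controllers shipping in servers today.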

We understand that it would be hard to accelerate PCI-Express 7.0 controllers and switches to market, and that all manner of products would also have to be accelerated. Compute engine and peripheral makers alike would be hesitant, wanting to squeeze as much return as possible out of their PCI-Express 6.0 product cycles.

Still, as PCI-Express 6.0 is put into products and goes through its rigorous testing, which will be needed because of the new PAM-4 signaling and FLIT low-latency encoding it makes use of, we think the industry should start accelerating to match up with the CPU and GPU roadmaps as best as possible and get onto a two-year cadence alongside them.

Get the components in balance and then move ahead all at once, together.

Go here to see the original:
PCI-Express Must Match The Cadence Of Compute Engines And Networks - The Next Platform

Samsung reveals Q2 2023 earnings, profit nosedives to 14-year low – SamMobile – Samsung news

Last updated: July 7th, 2023 at 06:11 UTC+02:00

Samsung has been going through some tough times. Over the past year, its semiconductor chip business has seen a massive downturn amid global economic woes. Since most of the company's profits usually come from its semiconductor chip business, it has been hit hard. The South Korean firm unveiled its earnings estimate for Q2 2023, and things appear worrying.

Samsung expects Q2 2023 sales to be around KRW 60 trillion (around $45.91 billion), while its operating profit would be around a paltry KRW 0.6 trillion (around $459 million). That's a whopping 95.74% drop in profit compared to the previous year (Q2 2022), while sales dropped 22.28% from a year ago. This is the second consecutive quarter in which the company has reported worrying figures. Even its Q1 2023 profit dropped 96% compared to Q1 2022.
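As a sanity check, the year-ago figures implied by those percentages (roughly KRW 77.2 trillion in sales and KRW 14.1 trillion in operating profit, consistent with Samsung's reported Q2 2022 results) can be verified in a few lines:

```python
# Back out the year-over-year drops quoted above.
# Prior-year figures (KRW trillion) are Samsung's reported Q2 2022 results.
def yoy_drop_pct(prior: float, current: float) -> float:
    """Year-over-year decline as a percentage."""
    return (1 - current / prior) * 100

profit_drop = yoy_drop_pct(14.1, 0.6)   # operating profit, KRW trillion
sales_drop = yoy_drop_pct(77.2, 60.0)   # sales, KRW trillion

print(f"Operating profit drop: {profit_drop:.2f}%")  # ~95.74%
print(f"Sales drop: {sales_drop:.2f}%")              # ~22.28%
```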

While the company hasn't revealed its complete and final figures, analysts attribute the poor performance to the semiconductor division (Samsung Device Solutions). This division is expected to have made losses to the tune of up to KRW 4 trillion ($3.06 billion). Clients aren't buying enough memory chips for their cloud servers and high-performance computing machines. Samsung said earlier this year that it expects this phase to continue throughout the year.

The company's smartphone division seems to have performed solidly, though. The Galaxy S23 series has been selling really well worldwide. In some countries, the company's high-end phones unveiled earlier this year sold 1.6-1.7x more than their predecessors during the same one-month period. In the home appliances and TV segments, however, it is seeing tough competition and wants those segments to improve their profits.

See original here:
Samsung reveals Q2 2023 earnings, profit nosedives to 14-year low - SamMobile - Samsung news

Twitters new CEO finally explains rate limits & what it means – Dexerto

Joel Loynds

Published: 2023-07-05T13:46:15

Updated: 2023-07-05T13:46:24

Twitter's new CEO, Linda Yaccarino, has broken the tension with new reasoning behind the current read rate limits, which was also posted on the company's blog.

After a few days of turmoil behind the scenes at Twitter, Linda Yaccarino, the new CEO of the social media company, has come forward and issued a statement, coupled with a blog post regarding the company's plan for rate-limiting users.

The current owner, Elon Musk, announced on July 1 that the platform would limit how many posts users could see before it locked them from seeing anything else.
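The mechanics of such a cap are simple to sketch. The class below is purely illustrative, not Twitter's actual implementation; the limit value and per-user bookkeeping are invented for the example:

```python
# Minimal sketch of a per-user daily read cap, in the spirit of the limits
# described above. Illustrative only; limits and reset logic are invented.
from collections import defaultdict

class ReadLimiter:
    def __init__(self, daily_limit: int):
        self.daily_limit = daily_limit
        self.reads = defaultdict(int)  # user_id -> reads so far today

    def allow(self, user_id: str) -> bool:
        """Return True and count the read if the user is under the cap."""
        if self.reads[user_id] >= self.daily_limit:
            return False
        self.reads[user_id] += 1
        return True

    def reset(self):
        """Called once a day to clear all counters."""
        self.reads.clear()

limiter = ReadLimiter(daily_limit=600)  # e.g. a cap for unverified users
```

Once a user exhausts the cap, every further `allow` call returns False until the daily reset, which matches the "locked from seeing anything else" behavior users reported.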

This led to Twitter essentially DDOS-ing itself and speculation arose around the $1.5 billion bill it might owe in relation to Google Cloud servers.

Yaccarino has broken her silence on the topic, tweeting her support for the action as a means of strengthening the platform.

In the tweet, Yaccarino said:

When you have a mission like Twitter you need to make big moves to keep strengthening the platform.

This work is meaningful and on-going.

Ben Collins, a reporter for NBC News, questioned the CEO and the official blog post:

If this were true, which seems extremely unlikely, why wouldn't you warn users first?

Either another lie, or wildly irresponsible, or both.

Meanwhile, the short statement made on Twitter's official blog seems to reiterate what Musk claims is the reasoning behind the limit: people are scraping data from the site without paying for it.

Twitter has recently clamped down on access to its API and began charging for it.

However, recent reports have indicated that since Musk's tenure at Twitter began, even those paying for the platform have found it broken.

Musk has said that the measures are temporary, but we've yet to see the rate limit lifted. Meanwhile, Meta is prepping its competitor, Threads, which launches tomorrow.

Read more here:
Twitters new CEO finally explains rate limits & what it means - Dexerto

Modern eDiscovery Solutions: The Case for the Cloud in 2023 – JD Supra

While the cloud has become commonplace in our everyday lives, many legal teams are still missing out on the benefits of cloud-based ediscovery software.

As of 2021, there is a 50/50 split between ediscovery solutions being on-premise or in the cloud, leaving a large portion of the market still not leveraging the notable benefits of cloud-based ediscovery software.

Despite initial skepticism from others in the legal tech field, Rakesh "Rocky" Madhava, Founder and CEO of Nextpoint, has always remained a firm believer in the potential of cloud technology. As one of the early pioneers of cloud-based solutions in the legal industry, Nextpoint blazed a trail for other legal tech companies to follow.

There's ample data revealing how cloud computing has become a dominant force in today's digital world. Cloud-based collaboration tools have become increasingly popular; according to a report by Flexera, 58% of businesses now consider cloud technology a strategic asset, and 90% of organizations now use some form of cloud service.

Given the prevalence of cloud-centric technologies in our daily lives, it's only natural for the legal profession to catch up and start utilizing cloud-based tools and services to improve efficiency, collaboration, and accessibility. With cloud technology, legal professionals can access important documents and files from anywhere, collaborate with colleagues and clients in real time, and streamline their workflows and manual processes.

You may be wondering: if the cloud is so great, why are so many legal professionals missing out?

Change is Difficult

The cloud, or more specifically cloud computing, refers to the process of accessing resources, software, and databases over the internet and outside the restrictions of local hardware.

There are various reasons why legal teams stick with on-prem ediscovery solutions (data security concerns taking first place), and difficulty adapting to change is where many teams fall short.

Those who use on-prem solutions may think, "This is fine, there's no need to make any changes," to avoid workflow disruption from data migration, implementation, etc. However, settling for what is merely fine may actually be holding you back from the many benefits of cloud-based ediscovery software.

The Importance of Cloud Security

Understanding the security requirements for keeping data safe has become essential as companies continue to migrate toward the cloud. Security threats have become more advanced as our digital landscape continues to evolve, and many of these threats explicitly target cloud computing providers. But protections against these threats have also become more robust; in fact, the cloud is generally more secure than on-premise technologies.

For legal teams, choosing a cloud provider with advanced security protocols is essential. Firms are also responsible for implementing their own safeguards against security threats, which include using secure passwords, limiting account sharing, and only accessing data on private devices. It is imperative to prioritize best practices and cloud security solutions to ensure data protection regardless of the cloud environment your organization operates in, whether public, private, or hybrid.

Embracing the Cloud: 5 Advantages of Cloud-Based eDiscovery

1. Increased Scalability

Traditionally, on-premise solutions were the go-to choice for legal professionals, but times are changing. The limitations of relying on locally installed software have become apparent, prompting many legal teams to shift toward the cloud. One of the most compelling advantages of the cloud is its unparalleled scalability.

Unlike on-premise solutions, the cloud can effortlessly handle varying caseloads and data sizes. Whether you're dealing with a small caseload or an extensive document review, the cloud adapts and ensures optimal efficiency no matter the data volume. On-premise solutions, on the other hand, depend on the processing power of the local servers they're installed on, which are typically no match for the cloud. Furthermore, the cloud is constantly evolving to expand its processing capacity, while local servers rarely undergo this level of advancement.

The cloud's lightning-speed processing power has revolutionized the way document review is conducted. In a Nextpoint case study, Hugh Berkson, principal attorney at McCarthy Lebit Crystal Co., expressed his amazement at the speed with which documents were able to load in our cloud-based ediscovery software. "I think the thing that caught me by surprise is the speed with which documents will load when everything is online," Hugh said.

2. Automatic Upgrades and Reliable Service

When it comes to software updates, reliance on manual processes can be cumbersome and time-consuming. For on-premises solutions, updates aren't available unless an IT team or software provider applies them manually. This process is needlessly complex, and it puts you at risk of using outdated software that's no longer supported by the original software developers.

With a cloud-based solution, you can say goodbye to complicated manual updates and enjoy automatic upgrades without the need for an IT team. This not only saves valuable time and money, but also ensures that legal teams are always equipped with the latest features and security improvements.

In addition to the convenience of automatic upgrades, the cloud exhibits an unmatched level of reliability. The cloud's reliance on a vast network of public servers creates a robust infrastructure that minimizes downtime. By embracing the cloud, legal teams can enjoy an uninterrupted workflow, maximizing efficiency and minimizing costly interruptions.

For example, in the two years between 2015 and 2017, Amazon Web Services, the backbone of Nextpoint's secure hosting infrastructure, experienced a total of only 7.5 hours of downtime (impressive, we know!).
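For context, that downtime figure translates to an uptime percentage like so (assuming two 365-day years for simplicity):

```python
# Convert "7.5 hours of downtime over two years" into an uptime percentage.
HOURS_IN_TWO_YEARS = 2 * 365 * 24  # 17,520 hours
DOWNTIME_HOURS = 7.5

uptime_pct = (1 - DOWNTIME_HOURS / HOURS_IN_TWO_YEARS) * 100
print(f"Uptime: {uptime_pct:.3f}%")  # ~99.957%
```

That works out to roughly "three and a half nines" of availability over the period.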

3. Enhanced Security

Historically, security concerns have been a barrier to cloud adoption in the legal industry. However, many cloud providers have risen to the challenge, making significant investments in robust security controls and protocols, dispelling the doubt that once tainted this technology.

When it comes to on-premise solutions, ensuring security becomes a daunting task for those not specialized in IT security. The valuable time and effort spent on securing internal systems often diverts attention away from core functions, limiting productivity. It's unlikely that most legal organizations have an experienced IT team on-site, leaving them vulnerable to potential security threats.

A significant risk associated with on-premise data lies in its physical proximity to the enterprise itself, putting your network at risk of access by unauthorized users, particularly insider threats.

In contrast, the cloud operates under strict security standards. Top public cloud providers like AWS protect their servers with physical defenses such as fences, guards, biometric devices, and security cameras. Nextpoint also enacts a number of digital protections, including data encryption, controlled access, audit trails, and two-factor authentication.

Many cloud-based companies also undergo external audits; for instance, Nextpoint is proudly SOC 2 compliant, which means that our auditing procedure ensures we securely manage data to protect the privacy of our clients.

4. Greater Remote Collaboration

In ediscovery, collaboration plays an important role in ensuring seamless document review and production, regardless of physical location. By centralizing ediscovery processes in the cloud, legal professionals can break free from the constraints of location-dependent workflows.

Whether team members are located across different offices, cities, or even countries, the cloud serves as a unifying force, fostering collaboration with ease. Documents can be accessed, reviewed, and discussed in real time, eliminating the need for time-consuming tasks such as passing around physical copies of documents or emailing updated versions of work product.

5. Cost Predictability

With the cloud, legal teams no longer have to bear the financial burden of purchasing and maintaining costly infrastructure. The cloud provider takes care of software AND hardware requirements. Legal organizations can then reallocate their funds for other important endeavors.

The cloud also eliminates the need to hire an in-house IT team, which can level the playing field for firms of all sizes and make ediscovery more accessible for smaller firms and organizations operating on limited budgets.

Some legal teams have found the per-gigabyte pricing that many ediscovery software companies use prohibitive, as a large dataset can result in exorbitant fees. However, Nextpoint introduced our per-user pricing model with unlimited data hosting in 2017 to combat these unaffordable expenses and provide equal access to the cloud for firms of all sizes. While some legal cloud providers' fees vary widely, there are options for firms in search of more predictable ediscovery costs.
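The difference between the two pricing models is easy to illustrate with hypothetical numbers. The rates and case size below are invented for the example; real vendor pricing varies widely:

```python
# Hypothetical comparison of per-GB vs. per-user ediscovery pricing.
# All rates here are illustrative assumptions, not any vendor's actual prices.
def per_gb_cost(dataset_gb: float, rate_per_gb: float) -> float:
    """Monthly cost when hosting is billed by data volume."""
    return dataset_gb * rate_per_gb

def per_user_cost(users: int, rate_per_user: float) -> float:
    """Monthly cost when billing is per reviewer seat."""
    return users * rate_per_user

# A 500 GB matter reviewed by a five-person team, monthly:
gb_model = per_gb_cost(500, rate_per_gb=15.0)
user_model = per_user_cost(5, rate_per_user=250.0)
print(f"per-GB: ${gb_model:,.0f}/mo  per-user: ${user_model:,.0f}/mo")
```

Under these assumptions the per-GB model scales with data volume while the per-user model stays flat, which is why large datasets make the former unpredictable.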

Changing Rules and Regulations

Parties involved in a case have a duty to maintain the integrity, confidentiality, and security of all potential evidence, and information stored in the cloud is subject to the same requirements. With that being said, there are key issues that should be addressed for organizations utilizing the cloud for ESI.

Where is ESI Located in the Cloud?

It's crucial to know where your data is actually stored. Because cloud providers are expected to provide data at any given time, they may store your data in more than one data center for redundancy purposes.

Be attentive to which ediscovery providers stay up to date with evolving legal standards. Choosing the right provider will mitigate any risk of issues that may arise due to changing data privacy and security laws.

In 2022, Nextpoint launched new developments for legal software in Canada and California amidst recent changes to data privacy and security laws. Although the launch is limited to Canada and California, it has sufficiently prepared us to address restrictions that may emerge in new regions amidst rising concerns over data privacy.

Data security and privacy have always been a top priority here at Nextpoint, and the growing prevalence of consumer privacy laws is changing what this looks like in the ediscovery world. We remain committed to protecting our users' data and helping them navigate the complexities of practicing law in a digital age.

eDiscovery and the Cloud Are Evolving Together

The proliferation of cloud storage solutions has allowed individuals to generate and retain vast amounts of data, much of which is now being incorporated into ediscovery processes, presenting us with the huge caseloads we often see today. However, the cloud's high processing power can also offer a solution to these challenges.

Many individuals and organizations are now leveraging cloud-based storage solutions (like Dropbox, etc.), and legal tech companies are adapting and adding new features to accommodate these widespread platforms. In Nextpoint, you can now easily drop data from other cloud platforms into your ediscovery software rather than undergo a time-consuming and complicated collection process.

The cloud has both influenced ediscovery and been influenced by it. The emergence of stricter data privacy regulations means we're seeing a shift toward greater transparency in data hosting locations. Will this be isolated to ediscovery, or will it permeate the rest of the cloud storage landscape? Only time will tell, but it's clear that ediscovery and the cloud have become intertwined.

Data permeates every aspect of our lives and plays a pivotal role in litigation. In our constantly evolving digital environment, the cloud stands as the only solution capable of adapting to the dynamic nature of data. It provides the necessary infrastructure to handle vast amounts of information involved in legal proceedings and ensures integration with our evolving data-driven landscape.

Continued here:
Modern eDiscovery Solutions: The Case for the Cloud in 2023 - JD Supra