
Three forgotten gems of ‘deep soul’ music from the American South – EL PAÍS USA

Our new Hidden Gems series aims to educate and entertain readers with excellent but obscure music. We begin this month with an exquisite selection of what is known as deep soul, the gritty sounds of the southeastern United States, where much of the best soul music was made.

The American Deep South includes music hotspots like Memphis, Atlanta, Birmingham and Macon, but not Nashville, which is a little too close to Chicago. The region gave birth to many all-time greats like Little Richard, Otis Redding, Rufus Thomas, Carla Thomas and Al Green. It is a sacred place for soul music, and major record labels like Stax Records (Memphis, Tennessee) and FAME Studios (Muscle Shoals, Alabama) were started there. However, this month's hidden gems don't come from any famous names but from three forgotten musicians. They represent the true deep soul, a pure, vibrant and evocative sound that reminds us of the greatness of soul music, that extraordinary genre created from the depths of human emotion.

He could have had it all but died almost penniless. Arthur Alexander pioneered what became known as the Muscle Shoals sound, a fascinating soup of soul, blues and R&B cooked with passion and quality in the FAME studios. Alexander made music before other stars like Aretha Franklin, Wilson Pickett and Candi Staton came along. "You Better Move On" has hints of the late 1950s, when American teenagers spent their time hanging around diners, racing cars and fooling around at summer fairs. With deep and precise tones, Alexander is a southern crooner whose unhurried singing slowly melts his listeners. The music beats like a feverish high-school dance, but the lyrics convey painful feelings of life in the segregated and poverty-stricken South.

Marred by his bipolar disorder, James Carr's career never soared to the heights of contemporaries like Otis Redding or Wilson Pickett. However, "You Got My Mind Messed Up" proves that he was undoubtedly talented. Carr was turned down by Stax Records, but he landed a recording contract with Goldwax in 1964 and stayed with the label for the rest of his short career. He would spill out his heart in his songs, his breaking voice wailing in true Redding style: just listen to "Love Attack". This splendid album shows he belongs at the top of the southern soul pantheon, but history relegated him to the shadows.

She grew up singing gospel music but blossomed outside church walls. Doris Duke had a torridly beautiful voice and started out as a backup singer for Nina Simone. In 1969, former Atlantic Records producer Jerry "Swamp Dogg" Williams Jr. signed her as a solo artist and recorded the unappreciated "I'm a Loser" album. Swamp Dogg's elegant production meshed perfectly with Duke's melancholy sound; she was likened to a mid-career Franklin, but in a leather jacket, more badass and streetwise. After all, she had grown up in Georgia and knew what it was like to struggle for her daily bread.


Here is the original post:
Three forgotten gems of 'deep soul' music from the American South - EL PAÍS USA


Banking in the Cloud: How Financial Institutions Can Mitigate the … – JD Supra

As organizations in the financial sector continue to migrate IT and business services to the cloud and adopt other cloud offerings, it is important that financial institutions understand the risks associated with each. A U.S. Treasury report issued on February 8, 2023, showed that regulators are closely monitoring how the financial sector uses cloud services. With cloud service providers becoming more assertive in shifting risks to their customers, financial institutions may experience higher levels of regulatory scrutiny.

Over the past decade, the financial services sector has steadily migrated many information technology (IT) functions to cloud service providers: everything from video teleconferencing to internal communications to customer-facing applications. However, models of adoption and associated risk vary widely across the sector. As the Financial Services Sector Risk Management Agency, the U.S. Department of the Treasury issued a report on February 8, 2023, assessing these risks and associated challenges affecting the financial sector.

At its most basic level, cloud computing is a means by which organizations can access on-demand network services and infrastructure without having to host their own servers. It is flexible and scalable, so companies can easily add or remove resources as needed. The financial sector, in particular, has found cloud services to be valuable for a range of purposes, such as supporting remote work and using cloud-native capabilities.

Financial institutions are motivated to increase cloud adoption due to benefits such as cost reduction, quicker deployment of new IT assets, faster product and service development, and improved security and resilience. However, these benefits bring with them both risks and other challenges that organizations in the financial sector should consider as they migrate their IT and business functions to the cloud.

In its report, Treasury found that, as a symptom of the rapid adoption of cloud services across the sector, the vast majority of financial institutions have implemented cloud services, but at significantly varied maturity levels. A survey by the American Bankers Association (ABA) in 2021 revealed that more than 90 percent of banks surveyed reported maintaining some form of data, applications, or operations in the cloud. Furthermore, more than 80 percent of those surveyed reported being in the early stages of adopting cloud services. Only 5 percent of the surveyed banks described their use of cloud technology as mature.

Various types of cloud offerings (public, private, and hybrid) exist to cater to diverse service requirements. Public cloud, for instance, allows multiple customers, or "tenants," to share resources. Private cloud, by contrast, is an environment operated exclusively for a single organization, either on or off premises, and allows the cloud to be tailored to meet specific needs, such as security, compliance, or performance. A hybrid model incorporates both public and private cloud services alongside in-house data centers and is the preferred choice for many large financial institutions.

In contrast, some smaller and mid-sized institutions have adopted models using purely public cloud environments, significantly reducing their cost and data center usage but also increasing their risk. If set up properly, public cloud services can offer a resilient and secure setting. However, the level of resilience and security for a specific cloud service may differ dramatically depending on the provider, service, configuration, provisioning, and management. And, importantly, not all of these functionalities may be accessible in every situation.

Treasury has highlighted six primary obstacles to the adoption of cloud technology in the financial industry:

Treasury plans to take a number of steps to assist financial institutions in mitigating risk from the operational disruption of cloud services. As a preliminary step, Treasury plans to establish a Cloud Services Steering Group to address issues raised in the report. The Steering Group's functions will include:

In light of this regulatory focus on cloud usage, financial institutions can take several steps to mitigate the regulatory and security risks associated with cloud adoption.

Read the original:
Banking in the Cloud: How Financial Institutions Can Mitigate the ... - JD Supra


Huawei Cloud Backup and Recovery: the answer to SA IT risks – ITWeb

The ongoing and potentially worsening load-shedding in South Africa is impacting a great deal more than IT running costs; it is also putting data and business continuity at risk. Moving strategically to the cloud can mitigate these risks.

This is according to Siphiwe Matore, Cloud Solution Architect at Huawei South Africa, who says powering on-premises data centres with generators can cost organisations hundreds of thousands or even millions of rands in diesel and operational staff. Equally concerning is the risk of costly business downtime and the loss of transactional data during power outages.

"Load-shedding has made it very clear that keeping business IT operations running is critical for continuity, and just a few hours of downtime can result in the loss of millions of rands. To experience this multiple times a day is something businesses just can't recover from," she says.

Matore says backup power is not the solution as it becomes apparent that load-shedding may drag on for years to come. "Relying on UPS devices, inverters and generators is like putting a plaster on the wound," she says. "Constantly switching between mains and backup power can result in service interruptions and damage IT hardware in the long run," she notes.

But it is important to note that load-shedding is not the only reason to worry about a company's data protection and IT service availability. "Eighty-six percent of South African businesses experienced some form of cyber security attack in 2022 alone, and now we're the eighth most attacked country in the world," Matore says. "Apart from malicious attacks, there are also other threats, such as natural disasters like the storms we experienced in KwaZulu-Natal, hardware failures due to degradation or power surges, or even unintentional errors like staff deleting critical files and data. All these threats will be detrimental to businesses and continuing operations. So organisations really need to ask themselves what they are going to do in the event of a disaster."

She says it has become imperative that organisations move to the cloud for backup and recovery services that are easy to use, efficient, reliable and secure, ensuring uninterrupted business continuity.

Matore notes that backup and disaster recovery (DR) are distinctly different. Backup is about data protection, where a copy of your data is created to be used in the event of the original data being lost or unavailable. Traditionally, these data copies were written to tapes or removable drives and storage appliances. Now they are increasingly being stored in the cloud, which can be more economical and secure. On the other hand, DR is a set of measures to ensure the timely recovery of data, applications and systems to a separate physical site following a system failure, natural catastrophe or ransomware attack. So if your primary site were to be compromised, the business could seamlessly continue its IT operations from the cloud and mitigate the loss of business operations and money.

She adds: "The use of a hybrid cloud is expected to increase by 31% in the next year in South Africa alone, with 53% of companies using backup as a service and 20% using DR as a service, as predicted at the 2019 Cloud Conference and Expo."

Huawei Cloud, with the support of its partners, is helping customers migrate to the cloud to mitigate the data loss and continuity risks they are facing.

Huawei Cloud Backup and Recovery (CBR) enables organisations to back up elastic cloud servers (ECSs), bare metal servers (BMSs), elastic volume service (EVS) disks, SFS Turbo file systems, local files and directories, and on-premises VMware virtual environments with ease. In case of a virus attack, accidental deletion, or a software or hardware fault, organisations can restore data to any point in the past when the data was backed up. "CBR protects your services by ensuring the security and consistency of your data," she says.

CBR supports crash-consistent backup for multiple disks on a server and application-consistent backup for database servers, ensuring data security and reliability. Incremental forever backups shorten the time required for backup by 95%. With Instant Restore, CBR supports recovery point objective (RPO) as low as one hour and recovery time objective (RTO) within minutes.
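The headline numbers rest on the incremental-forever model: after one full copy, each backup run stores only blocks whose content has changed since the last run. The Python sketch below illustrates the general technique with content-addressed block storage; it is a conceptual illustration only, not Huawei's CBR implementation, and the 4 KB block size and file layout are arbitrary assumptions.

```python
import hashlib
import json
from pathlib import Path

BLOCK_SIZE = 4096  # arbitrary block size, for illustration only

def incremental_backup(source: Path, index_file: Path, store: Path) -> int:
    """Write only blocks whose SHA-256 is not yet in the index.

    Returns the number of new blocks stored this run; on the first
    run every block is new, so it amounts to a full backup."""
    index = set(json.loads(index_file.read_text())) if index_file.exists() else set()
    store.mkdir(parents=True, exist_ok=True)
    new_blocks = 0
    with source.open("rb") as f:
        while block := f.read(BLOCK_SIZE):
            digest = hashlib.sha256(block).hexdigest()
            if digest not in index:
                (store / digest).write_bytes(block)  # content-addressed block store
                index.add(digest)
                new_blocks += 1
    index_file.write_text(json.dumps(sorted(index)))
    return new_blocks

# Example: incremental_backup(Path("data.bin"), Path("index.json"), Path("blocks"))
```

Second and later runs rewrite only the changed blocks, which is why backup windows shrink so sharply once the initial copy exists.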

With three availability zones in South Africa, Huawei Cloud's Storage Disaster Recovery Service (SDRS) provides cross-AZ disaster recovery (DR) protection for your servers, allowing you to achieve a recovery point objective (RPO) of zero through its high-speed network and continuous replication, while greatly reducing total cost of ownership (TCO). If a fault occurs in the source AZ, you can quickly restore services in the target AZ. Equipped with world-class infrastructure and solid power generation capacity, Huawei Cloud has not recorded a second of downtime since its arrival in South Africa three years ago. Huawei Cloud has invested heavily in R&D, capacity, redundancy and specialist technical support teams in the country to help partners and customers keep systems running and avoid data loss.

** Sources: https://mybroadband.co.za/news/cloud-hosting/306766-cloud-services-in-south-african-companies-here-are-the-latest-stats.html


View original post here:
Huawei Cloud Backup and Recovery: the answer to SA IT risks - ITWeb


Forget the hybrid cloud; it's time for the confidential cloud – VentureBeat

As cloud adoption gains traction, it's clear that security teams have been left to play catch up. In diverse hybrid cloud and multicloud environments, encrypting data-at-rest and in-transit isn't enough; it needs to be encrypted in use, too. This is where confidential computing comes in.

Today, the Open Confidential Computing Conference (OC3) gathered IT industry leaders to discuss the development of confidential computing. Hosted by Edgeless Systems, the event welcomed more than 1,200 attendees, including technologists and academics.

Speakers included Intel CTO Greg Lavender and Microsoft Azure CTO Mark Russinovich. They discussed how the role of confidential computing will evolve as organizations migrate to confidential cloud models.

One of the core panel discussions from the event, led by Russinovich, centered on defining what confidential computing is and isn't.

"The most succinct definition is the third leg in the data protection triangle of protecting data at rest, protecting data in transit; confidential computing is protecting data in use," Russinovich said in an exclusive interview with VentureBeat. "The data is protected while it's being processed."

More specifically, a vendor using confidential computing will create a secure piece of hardware that stores encryption keys within an encrypted trusted execution environment (TEE). The TEE encrypts data and code while in use so they cant be modified or accessed by any unauthorized third parties.
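As a rough mental model, the pattern looks like the Python sketch below: the host only ever handles ciphertext, and plaintext exists solely inside a function standing in for the TEE. This is an analogy rather than a real enclave (hardware enforces the boundary in practice), and the Fernet recipe from the `cryptography` package is used purely for illustration.

```python
from cryptography.fernet import Fernet

# In a real deployment this key would be provisioned to the enclave
# only after successful attestation; here it is generated locally.
enclave_key = Fernet.generate_key()

def enclave_process(ciphertext: bytes) -> bytes:
    """Stand-in for code inside a TEE: the only place the data
    is ever decrypted, processed, and re-encrypted."""
    f = Fernet(enclave_key)
    record = f.decrypt(ciphertext)  # data "in use", in the clear only here
    result = record.upper()         # placeholder computation on the plaintext
    return f.encrypt(result)        # nothing leaves the boundary unencrypted

# The host sees only ciphertext on the way in and on the way out.
sealed_input = Fernet(enclave_key).encrypt(b"account=1234;balance=100")
sealed_output = enclave_process(sealed_input)
```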

"Data in use means that, while an application is running, it's still impossible for a third party, even the owner of the hardware the application is running on, to ever see the data in the clear," said Mark Horvath, senior director analyst at Gartner.

Encrypting data-in-use, rather than at-rest or in-transit, means that organizations can confidentially and securely process personally identifiable information (PII) or financial data with AI, ML and analytics solutions without exposing it in memory on the underlying hardware.

It also helps protect organizations from attacks that target code or data in use, such as memory scraping or malware injection attacks like those launched against Target and the Ukraine power grid.

One of the underlying themes at the OC3 event, particularly in a presentation by Lavender, was how the concept of the confidential cloud is moving from niche to mainstream as more organizations experiment with use cases at the network's edge.

"The use cases are expanding rapidly, particularly at the edge, because as people start doing AI and machine learning processing at the edge for all kinds of reasons [such as autonomous vehicles, surveillance infrastructure management], this activity has remained outside of the security perimeter of the cloud," said Lavender.

The traditional cloud security perimeter is based on the idea of encrypting data-at-rest in storage and as it transits across a network, which makes it difficult to conduct tasks like AI inferencing at the network's edge. This is because there's no way to prevent information from being exposed during processing.

"As the data there becomes more sensitive, particularly video data, which could have PII information like your face or your driver's [license] or your car license [plate] number, there's a whole new level of privacy that intersects with confidential computing that needs to be maintained with these machine learning algorithms doing inferencing," said Lavender.

In contrast, adopting a confidential cloud approach enables organizations to run workloads in a TEE, securely processing and inferencing data across the cloud and at the network's edge, without leaving PII, financial data or biometric information exposed to unauthorized users and compliance risk.

This is a capability that early adopters are aiming to exploit. After all, in modern cloud environments, data isnt just stored and processed in a ring-fenced on-premise network with a handful of servers, but in remote and edge locations with a range of mobile and IoT devices.

Organizations that embrace confidential computing unlock many more opportunities for processing data in the cloud. For Russinovich, some of the most exciting use cases are multi-party computation scenarios.

"These are scenarios where multiple parties can bring their data and share it, not with each other, but with code that they all trust, and get shared insights out of that combination of data sets with nobody else having access to the data," said Russinovich.

Under this approach, multiple organizations can share data sets to process with a central AI model without exposing the data to each other.
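A sketch of the same pattern, with the trusted code again simulated in ordinary Python: every party encrypts its records under a key released only to the attested code, and only the aggregate result leaves the boundary. The party names, data and averaging computation are made-up illustrations, not any vendor's implementation.

```python
import json
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()  # released only to attested enclave code
f = Fernet(shared_key)

# Each party encrypts its private records before contributing them.
payloads = [
    f.encrypt(json.dumps([4.1, 5.0, 3.8]).encode()),  # hospital A's readings
    f.encrypt(json.dumps([6.2, 5.5]).encode()),       # hospital B's readings
]

def enclave_aggregate(encrypted: list[bytes]) -> float:
    """Trusted code: decrypts inside the boundary and returns
    only the shared insight, never the individual inputs."""
    values = []
    for payload in encrypted:
        values.extend(json.loads(f.decrypt(payload)))
    return sum(values) / len(values)

print(enclave_aggregate(payloads))  # parties learn the mean, not each other's data
```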

One example of this is Accenture's confidential computing pilot developed last year. This used Intel's Project Amber solution to enable multiple healthcare institutions and hospitals to share data with a central AI model to develop new insights on how to detect and prevent diseases.

In this particular pilot, each hospital trained its own AI model before sending information downstream to be aggregated within a centralized enclave, where a more sophisticated AI model processed the data in more detail without exposing it to unauthorized third parties or violating regulations like HIPAA.

It's worth noting that in this example, confidential computing is differentiated from federated learning because it provides attestation that the data and code inside the TEE are unmodified, which enables each hospital to trust the integrity and legitimacy of the AI model before handing over regulated information.
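Attestation itself can be pictured as comparing a cryptographic measurement of the code against a known-good value before any key or data is released. The minimal check below conveys the idea; real schemes (Intel SGX quotes, for example) rely on reports signed by the hardware, which this sketch omits.

```python
import hashlib

def measure(code: bytes) -> str:
    """Measurement of the code that will run inside the enclave."""
    return hashlib.sha256(code).hexdigest()

trusted_code = b"def aggregate(data): return sum(data) / len(data)"
expected = measure(trusted_code)   # value published by the model's operator

# Each hospital verifies the reported measurement before releasing data.
reported = measure(trusted_code)   # in reality, part of a hardware-signed report
if reported != expected:
    raise RuntimeError("enclave code was modified; withhold the data")
```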

While interest in confidential computing is growing as more practical use cases emerge, the market remains in its infancy, with Absolute Reports estimating it at a value of $3.2 billion in 2021.

However, for OC3 moderator Felix Schuster, CEO and founder of Edgeless Systems, adoption of confidential computing is rapidly deepening.

"Everything is primed for it," said Schuster. He pointed out that Greg Lavender recently spoke in front of 30 Fortune 500 CISOs, of which only two had heard of confidential computing. After his presentation, 20 people followed up to learn more.

"This unawareness is a paradox, as the tech is widely available and amazing things can be done with it," said Schuster. There is consensus among the tech leaders attending the event that all of the cloud will inevitably become confidential in the next few years.

Broader adoption will come as more organizations begin to understand the role it plays in securing decentralized cloud environments.

Considering that members of the Confidential Computing Consortium include Arm, Facebook, Google, Nvidia, Huawei, Intel, Microsoft, Red Hat, AMD, Cisco and VMware, the solution category is well-poised to grow significantly over the next few years.

So far, confidential computing adoption has largely been confined to regulated industries, with more than 75% of demand driven by industries including banking, finance, insurance, healthcare, life sciences, public sector and defense.

As the Accenture pilot indicates, these organizations are experimenting with confidential computing as a way to reconcile data security with accessibility so that they can generate insights from their data while meeting ever-mounting regulatory requirements.

Keeping up with regulatory compliance is one of the core drivers of adoption among these organizations.

"The technology is generally seen as a way to simplify compliance reporting for industries such as healthcare and financial services," said Brent Hollingsworth, director of the AMD EPYC Software Ecosystem.

"Instead of dedicating costly efforts to set up and operate a secure data processing environment, organizations can process sensitive data in encrypted memory on public clouds, saving costs on security efforts and data management," said Hollingsworth.

In this sense, confidential computing gives decision makers both peace of mind and assurance that they can process their data while minimizing legal risk.


More:
Forget the hybrid cloud; it's time for the confidential cloud - VentureBeat


How to Apply NIST Principles to SaaS in 2023 – The Hacker News

The National Institute of Standards and Technology (NIST) is one of the standard-bearers in global cybersecurity. The U.S.-based institute's cybersecurity framework helps organizations of all sizes understand, manage, and reduce their cyber-risk levels and better protect their data. Its importance in the fight against cyberattacks can't be overstated.

While NIST hasn't directly developed standards related to securing the SaaS ecosystem, they are instrumental in the way we approach SaaS security.

NIST recently released its Guide to a Secure Enterprise Network Landscape. In it, they discuss the transformation from on-premise networks to multiple cloud servers. Access to these servers, and the accompanying SaaS apps, is through both secure and unsecured devices and locations across disparate geography.

The move to the cloud has effectively obliterated the network perimeter. As a result, companies have increased their attack surface and are experiencing an escalation of attacks that span across network boundaries.

Rather than focus on network-centric security, security must take a three-pronged approach. The user, endpoint, and application are keys to protecting data. This new paradigm emphasizes the importance of identity, location, and contextual data associated with the user, device, and service.


Today's security tools need to scale to meet the volume, velocity, and variety of today's applications. They need to integrate seamlessly with SaaS applications and provide coverage for the entire SaaS stack.

To be effective, these tools need to minimize human intervention for monitoring and remediation. Automation is critical for an ecosystem that demands secure configurations for each user account that has access to the application. Large organizations may have millions of configurations to secure across their entire SaaS stack; closing them manually is an impossible task.

SaaS security tools must be able to integrate with all the apps on the stack and identify each application through the SaaS apps' APIs. Once connected, they must monitor the security configurations, staying alert to any changes. This configuration drift can have severe consequences, as it exposes SaaS applications by removing the safeguards put in place to prevent unauthorized access. Such a tool needs to continuously monitor applications and issue alerts as risk increases.
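At its core, the drift check is a diff between the settings an app reports and an approved baseline. The sketch below polls a hypothetical /settings endpoint with the requests library; the endpoint, field names and token are placeholder assumptions, since each real SaaS vendor exposes its own admin API.

```python
import requests

# Approved baseline; field names are hypothetical placeholders.
BASELINE = {"mfa_required": True, "public_sharing": False, "session_timeout_min": 30}

def check_drift(api_base: str, token: str) -> dict:
    """Return settings whose live value differs from the baseline."""
    resp = requests.get(
        f"{api_base}/settings",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    live = resp.json()
    return {key: (want, live.get(key))
            for key, want in BASELINE.items() if live.get(key) != want}

# Example (hypothetical tenant):
# drift = check_drift("https://api.example-saas.com/v1", token="...")
# if drift:
#     print("ALERT: configuration drift detected:", drift)
```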

Effective SaaS security tools use contextual data to detect threats to the application and its data. These threats can come from humans and machines and may have access to the system using verified credentials.

Contextual data from across the SaaS stack can help identify paradoxical travel, spikes in failed authentication attempts from the same IP address for multiple accounts, or attempts where automated tools test weak and common passwords against known user names. It can also recognize malicious third-party applications that are significantly overprivileged for their functionality.
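The password-spraying case (one IP failing against many accounts) falls out of a simple grouping over authentication logs, as this sketch shows with made-up records and an arbitrary threshold:

```python
from collections import defaultdict

# Illustrative log records: (source_ip, account, login_succeeded)
events = [
    ("203.0.113.7", "alice", False),
    ("203.0.113.7", "bob", False),
    ("203.0.113.7", "carol", False),
    ("198.51.100.2", "alice", True),
]

THRESHOLD = 3  # arbitrary: distinct accounts with failures from one IP

failed_accounts = defaultdict(set)
for ip, account, succeeded in events:
    if not succeeded:
        failed_accounts[ip].add(account)

for ip, accounts in failed_accounts.items():
    if len(accounts) >= THRESHOLD:
        print(f"possible password spraying from {ip}: {sorted(accounts)}")
```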


In the world of SaaS, the devices represent the network perimeter. Accessing SaaS applications with devices that have poor hygiene can put all the data at risk. Compromised devices can hand over login credentials to threat actors, who can leverage that into breaching and stealing data.

Effective SaaS security tools partner with endpoint security providers to ensure that the devices that access SaaS apps have an up-to-date operating system, all software has been updated, and any patches have been applied.

While devices may be the perimeter, user ID is the barrier preventing unfettered access to company data. Access should be given using a zero-trust approach. All access should be granted through an SSO connected to an enterprise-managed IdP. Organizations should reinforce this entryway with a phishing-resistant MFA authenticator.

Effective SSPM platforms are built on robust security checks that review each SaaS configuration to ensure they are optimized for protection. Typically, security setting recommendations are influenced heavily by NIST's cybersecurity approach, and their guidance enables SSPM vendors to monitor and track usage, users, and behaviors, as well as identify threats.


Here is the original post:
How to Apply NIST Principles to SaaS in 2023 - The Hacker News


Migrating to the Cloud: Is It as Intimidating as It Appears? – InfoQ.com

Key Takeaways

Being Cloud Native is often considered crucial for business success in the current business landscape. However, the perception of becoming Cloud Native as a drastic change for a business might not necessarily be accurate. In this article, we will delve into the concept of Cloud Migration and its effects on the IT support infrastructure of your business.

Cloud refers to a network of remote servers that are accessible through the internet, where organizations can rent infrastructure and services from a Hyperscaler on a pay-per-use basis.

There are three main types of Clouds: public, private, and hybrid.

Cloud Native services refer to applications and services that are built, deployed, and run on the Cloud. These services are designed to take advantage of the features and capabilities offered by the Cloud, such as scalability, high availability, and elasticity.

A Cloud Provider is a company that provides Cloud-based infrastructure and services to organizations. These services can be rented on a pay-as-you-go basis, allowing organizations to save on both the upfront and ongoing costs associated with building and maintaining their own infrastructure.

The Cloud Provider is responsible for hosting the infrastructure and investing in the necessary upgrades and maintenance. This includes investing in hardware, software, and personnel to ensure that the services remain secure and reliable. The costs associated with these investments are passed onto the consumers through operational expenses rather than capital expenses.

In the long run, it is often more cost-effective for organizations to rent the services they need from a Cloud Provider, as it enables them to avoid the upfront costs of building their own infrastructure. Additionally, organizations can scale up or down as needed without incurring the costs of adding or removing physical hardware.

In conclusion, using a Cloud Provider provides organizations with access to a range of IT resources and services without the burden of owning and maintaining the underlying infrastructure.

Cloud Native Services are those services and infrastructure specifically designed to run on cloud platforms, hosted and maintained by Cloud Providers. These services can include a variety of offerings, such as virtual machines (VMs), application servers, VPNs, load balancers, routers, databases, and disk storage. They can be divided into three main categories: compute services, network services, and storage services.

Cloud native services provide a powerful and flexible infrastructure that can help organizations to modernize their applications, improve scalability, and reduce operational costs.

Cloud Migration is about moving all or parts of your data, infrastructure, or compute services to a cloud provider.

Here is an overview of the 6-Rs approach to Cloud Migration planning. The six migration types are Rehost, Replatform, Repurchase, Refactor, Retire, and Retain.

Throughout the entire project, it is essential to continuously monitor and manage the cloud environment to ensure that it remains secure, cost-effective, and aligned with the business objectives. This may involve ongoing maintenance and support, as well as periodic optimization and updates to the infrastructure and applications. The typical methodology for a cloud migration involves several key stages, including:

Discovery: Discovery is used to define the business and technical case/scope, plus assets to migrate.

Assessment: Assessment is used to plan the migration and see potential methods of execution.

Migration: Migration is used to run the planned migration steps, both technical and organizational.

Run State: A designated run team maintains the environment going forward.

When you migrate, what do you typically migrate or set up? The following is a sample of the common ones:

Migration use cases are specific scenarios or situations where a business may need to migrate their IT infrastructure and services to the cloud. These use cases can vary depending on the type of business and the current state of their IT infrastructure. A variety of use cases can drive cloud migration and technical scenarios, including:

Firstly, let's look at some of the options for Data Migration.

Static Data:

File Data:

(No)SQL Data:

Secondly, let's look at some of the options for Network and Topology migration.

Network migration is usually the most manually intensive as it requires mapping physical infrastructure and topology to cloud provider-specific IaaS. A typical approach might be:

Lastly, let's look at some options for Compute Service Migration.

Compute migration deals with migrating application servers, applications, and server clusters to the cloud. It is not just about migrating infrastructure but can also be about refactoring service architectures for cloud as well. Common approaches are:

Rehost:

Re-platform:

Refactor:

Docker and Kubernetes are two critical technologies you need to be aware of to benefit from migrating your applications to the Cloud.

Containerization and Docker:

Kubernetes:

Let's take a closer look at the business impacts of cloud migration, both positive and negative.

Like any organizational change, potential impacts need to be considered.

Positive Impacts:

Negative Impacts:

Business

Technology

Migrating to the cloud may seem intimidating at first, but with careful planning and the right resources, it can be a straightforward and relatively painless experience.

Key things to remember: when looking at Cloud Migration, keep the following points and the maturity model in mind.

Cloud Services refer to managed infrastructure as a service (IaaS) or platform as a service (PaaS) offerings that are hosted in the cloud. Cloud Migration involves transferring data, infrastructure, and applications to these managed services in the cloud. A successful Cloud Migration strategy must consider technical and business objectives and prioritize incremental migration based on business priorities. Utilizing rehost, re-platform, or hybrid cloud strategies can help minimize the work involved in the migration process. Although refactoring migrated services can help optimize their use of the cloud, it is not an essential step. If you plan it right, you can do it in small bits at a time and still get great benefits.

See the original post:
Migrating to the Cloud: Is It as Intimidating as It Appears? - InfoQ.com


Radware introduces a next-gen cloud application security centre – iTWire

COMPANY NEWS: Radware, a leading provider of cyber security and application delivery solutions, introduced a next-generation cloud application security centre in Tel Aviv, Israel, which is home to the company's headquarters.

The opening of the security centre is part of the company's cloud security service growth initiative, which is focused on innovation and scalability. The company also has an existing cloud DDoS scrubbing centre in Israel.

The addition of the new Israeli centre follows the recent rollout of facilities in Australia, Canada, Chile, Italy, New Zealand, Taiwan, and the United Arab Emirates. The facilities are part of Radware's worldwide cloud security service network, which includes more than 50 security centres and delivers an attack mitigation capacity of 12Tbps.

The centres are designed to reduce traffic latency as well as increase service redundancy and mitigation capacity to help customers defend against denial-of-service attacks, web application attacks, malicious bot traffic, and attacks on APIs. They also help increase resiliency and comply with offshore data routing requirements.

"Radware's security centres deliver a global footprint, network and application security, true multi-tenant service, as well as high resiliency," said Trustnet CEO Zion Zvi. Trustnet is a leading integration and consulting company in the field of information and cyber security. "Radware's latest cloud expansion offers a great service to Israeli organisations in need of rapid response times and scalable protection."

The Israeli cloud application security centre supports Radware's 360-degree Cloud Application Protection Service, which spans from the browser side to the server side. The best-of-suite offering includes the company's cloud-based web application firewall, bot manager, API protection, application-layer DDoS protection, and its recently released client-side protection.

To deliver a higher level of application security with lower false positives, the security is based on automated, machine-learning based algorithms that learn legitimate user behaviour and then separate malicious and legitimate traffic. Dozens of Israeli customers rely on Radware's cloud security services.

According to Radware's 2022 Global Threat Analysis Report, the number of DDoS attacks rose by 150 percent compared to 2021. Web application and API attacks increased 128 percent year over year, significantly outpacing the 88 percent increase in attacks between 2020 and 2021.

"Fuelled by the increasing frequency and sophistication of cyber attacks and strong business demand, we continue to expand our global cloud security footprint across major geographies," said Radware vice president of cloud security services Haim Zelikovsky.

"The cloud security centres combine state-of-the-art protection and ultra-high bandwidth performance to defend against the most harmful network and application attacks."

Industry analysts such as Forrester Research, Gartner, GigaOm, KuppingerCole and Quadrant Knowledge Solutions continue to recognise Radware as a market leader in cyber security. The company has received numerous awards for its application and API protection, web application firewall, bot management, and DDoS mitigation solutions.

About Radware

Radware is a global leader of cyber security and application delivery solutions for physical, cloud, and software defined data centres. Its award-winning solutions portfolio secures the digital experience by providing infrastructure, application, and corporate IT protection, and availability services to enterprises globally. Radware's solutions empower enterprise and carrier customers worldwide to adapt to market challenges quickly, maintain business continuity, and achieve maximum productivity while keeping costs down. For more information, please visit the Radware website.

View post:
Radware introduces a next-gen cloud application security centre - iTWire


Dell Technologies adds to security lineup – iTWire

Dell Technologies has announced new security services and products, including a mechanism to help detect pre-delivery tampering with hardware.

Dell's new cloud-based version of its Secured Component Verification provides enterprise customers with additional assurance that their PCs are delivered as ordered and built.

Dell generates and securely stores a digital certificate representing key components put into a particular PC at the factory. These certificates can be used to verify component integrity across an entire PC fleet in one go.
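Conceptually, the scheme amounts to signing an inventory of the components installed at the factory and re-checking that inventory on delivery. The sketch below shows that shape using Ed25519 keys from the `cryptography` package; it illustrates the general technique only and is not Dell's actual certificate format.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Factory side: sign the component inventory of a specific PC.
factory_key = Ed25519PrivateKey.generate()
manifest = json.dumps({"cpu": "sn-123", "ssd": "sn-456", "nic": "sn-789"},
                      sort_keys=True).encode()
signature = factory_key.sign(manifest)
public_key = factory_key.public_key()

# Customer side: recompute the inventory on delivery and verify it.
delivered = json.dumps({"cpu": "sn-123", "ssd": "sn-456", "nic": "sn-789"},
                       sort_keys=True).encode()
public_key.verify(signature, delivered)  # raises InvalidSignature if tampered
print("component inventory matches the factory certificate")
```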

Managed Detection and Response Pro Plus is a fully managed security operations solution protecting endpoints, infrastructure, software, hardware and clouds through 24x7 threat detection and investigation; breach and attack simulations; penetration testing; cybersecurity training; and incident recovery care.

Product Success Accelerator for Cyber Recovery helps customers implement and operate an isolated cyber recovery vault to maintain business continuity in the event of an attack.

Three tiers are offered: Ready (implementation assistance), Optimise (quarterly vault assessments and recommendations) and Operate (operational assistance and support).

"Security is embedded in Dell's DNA. It's built into our designs, infrastructure, supply chain and our products," said Dell Technologies senior vice president of corporate strategy Matt Baker.

"Our growing portfolio of security services and solutions is helping organisations tackle their toughest security challenges and addressing the increasing complexity of how they stay secure across networks, devices and systems. We're helping customers double down on resilience in a challenging environment."

Dell Secured Component Verification on Cloud will be available globally for Dell commercial PCs in May.

Dell Managed Detection and Response Pro Plus is now available directly and through channel partners.

Dell Product Success Accelerator for Cyber Recovery is initially available in North America.


View original post here:
Dell Technologies adds to security lineup - iTWire


NordVPN releases a free version providing a milestone in personal … – iTWire

Global VPN service provider NordVPN has announced a free version meaning everybody, everywhere can enjoy encrypted, secure communications without worrying about cost.

NordVPN is going through one of its biggest transformations with a recent product update that includes new, easy file-sharing functionality through Meshnet as well as an open source Linux application. It also makes Meshnet free of charge for anyone.

Now, even without a NordVPN subscription, users can benefit from all the advantages NordVPN's Meshnet offers. Meshnet allows users to create a peer-to-peer VPN tunnel between devices with NordVPN installed.

"We are continuously expanding the capabilities of NordVPN. This release marks a significant change in openness by both making part of the service free as well as open-sourcing a substantial part of our client software, said NordVPN product strategist Vykintas Maknickas.

From now on, users no longer need a NordVPN subscription to enjoy all the advantages Meshnet offers. Meshnet creates a peer-to-peer VPN tunnel between desktops and mobile devices. This means users can work around limitations by routing their internet traffic through any remote device in the world, as long as it has NordVPN installed. Any Windows, Mac, or Linux device can turn into their personal VPN server. This means that people can enjoy all the benefits they have at home or provided by other users despite their current location.

In addition, using Meshnet, people can now instantly share files from one device to another. Take a screenshot on your mobile device, pick the NordVPN app in the sharing options and choose the device you want to send your screenshot to. It's that easy. Share photo or video files of unlimited size without losing quality. Users can share any files they want without limitation. Any file sent this way goes through an end-to-end encrypted peer-to-peer VPN tunnel, making it the most secure option for sharing. Users can send or receive files from their own devices as well as from family members, friends, or colleagues: anyone who has NordVPN installed. The new feature requires two-way consent, so users never have to put their privacy at risk.
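End-to-end encryption of a shared file reduces to: encrypt with a key that only the two endpoints hold, transmit ciphertext, decrypt on arrival. A generic sketch of the idea follows; it is not NordVPN's protocol, and the session key is assumed to have been exchanged out of band.

```python
from cryptography.fernet import Fernet

# Key known only to the sending and receiving devices (exchanged out of band).
session_key = Fernet.generate_key()

def send_side(path: str) -> bytes:
    """Encrypt the file; only ciphertext ever crosses the tunnel."""
    with open(path, "rb") as f:
        return Fernet(session_key).encrypt(f.read())

def receive_side(ciphertext: bytes, out_path: str) -> None:
    """Decrypt on the destination device."""
    with open(out_path, "wb") as f:
        f.write(Fernet(session_key).decrypt(ciphertext))
```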

Also, NordVPN has released three of its products under an open-source license: the entire NordVPN Linux application; Libtelio, a networking library used across NordVPN apps on all operating systems; and Libdrop, a library that's used to share files over Meshnet. This means that the source code of these products is now publicly available, allowing developers to review, audit, and contribute to the code. By making these products open source, NordVPN is demonstrating its commitment to transparency, community collaboration, and trust in its products.

NordVPN has updated its Meshnet documentation page to help you understand and make use of the most popular use cases of this feature. Some examples include:


Meshnet is available on all platforms, including Android, iOS, macOS, Windows, and Linux as well as Android TV. Users can link up to 10 personal and 50 external devices to their own network.



Read the original post:
NordVPN releases a free version providing a milestone in personal ... - iTWire


Forget HTTP: Ethereum has a new URL standard that can't be blocked – Cointelegraph

Web3 URLs enabled with the launch of ERC-4804 have made it onto Ethereum, allowing internet users to access Ethereum applications and nonfungible tokens (NFTs) without worrying about centralized censorship.

The new Ethereum standard, titled Web3 URL to EVM Call Message Translation, was first proposed on Feb. 14, 2022, and was co-authored by ETHStorage founder Qi Zhou, Ethereum researcher Sam Wilson, and Chao Pi.

The proposal describes an HTTP-style URL to directly access on-chain Web3 content, such as decentralized application (DApp) front-ends and NFTs. More than a year later, ERC-4804 was approved and finalized on the mainnet on March 1.

Anthurine Xiang, a spokesperson for layer-2 storage protocol ETHStorage, explained that in many cases, the ecosystem still relies on centralized web servers to access decentralized apps.

"Right now, all the DApps like Uniswap [...] claim to be decentralized apps," Xiang explained, adding: "But how [do] we get on the webpage? You have to go through the DNS. You have to go through GoDaddy. [...] All those are centralized servers."

Today, most users access the internet via Hypertext Transfer Protocol, widely known as HTTP.

When an internet user clicks a link or types in a website address, the computer uses HTTP to ask another computer to retrieve the information, such as a website or pictures.

Under ERC-4804, internet users have the option to type in web3:// (as opposed to http://) in their browsers to bring up DApps such as Uniswap or on-chain NFTs directly. This is because the standard allows users to directly run a query to the Ethereum Virtual Machine (EVM).
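In practice, that query is an ordinary EVM call. The Python sketch below turns a simplified web3://<address>/<path> URL into an eth_call JSON-RPC payload, loosely in the spirit of the standard's manual mode; the parsing is deliberately simplified relative to ERC-4804 itself, and the contract address and path are made-up examples.

```python
import json

def web3_url_to_eth_call(url: str) -> dict:
    """Translate a simplified web3://<address>/<path> URL into an
    eth_call request whose calldata is the raw path bytes."""
    assert url.startswith("web3://"), "not a web3 URL"
    address, _, path = url[len("web3://"):].partition("/")
    calldata = "/" + path  # manual-mode style: the path itself is the calldata
    return {
        "jsonrpc": "2.0",
        "method": "eth_call",
        "params": [{"to": address, "data": "0x" + calldata.encode().hex()}, "latest"],
        "id": 1,
    }

# Made-up contract address and resource path, for illustration only.
print(json.dumps(web3_url_to_eth_call(
    "web3://0x1234567890abcdef1234567890abcdef12345678/index.html"), indent=2))
```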

Entire websites can theoretically be accessed by these means as long as their content is stored on the Ethereum blockchain or a compatible layer-2 protocol. However, the costs of doing this are still very prohibitive, according to ETHStorage founder Qi Zhou.

"The critical issue here is that the storage cost on Ethereum is super, super expensive on mainnet," Zhou said in a recent presentation at ETHDenver.

"For example, 1 gigabyte of on-chain data will cost roughly $10 million. [...] That is unacceptable for a lot of Web2 applications and even a lot of NFTs," Zhou added, noting that layer-2 storage solutions could help mitigate some of the costs.

Xiang suggested that, given the costs, the new URL standard makes sense only for specific applications.

On the other hand, the new standard would be useful for DApps or websites at risk of censorship, with Tornado Cash as an example.

"For example, for Tornado Cash, a lot of people can't get to them through their website because there's censorship," Xiang explained.

"If you're a DApp and you've already been decentralized, why are you still using a centralized website for people to get access to you?"

Asked whether the new standard could be leveraged by bad actors to partake in illicit activity, Xiang said:

Instead, Xiang believes, like Bitcoin, they're just giving people a decentralized option they may not have otherwise.

The new Ethereum standard is the first of its kind for the blockchain, noted Xiang, though it's not the first solution to decentralized web hosting.

Related: How to host a decentralized website

The InterPlanetary File System (IPFS) is an example of a network created to provide, through decentralized means, what centralized cloud servers currently provide. However, Xiang noted that an IPFS URL can only link to static content, which can't be amended or changed.

ERC-4804 will allow for dynamic data, such as letting people leave likes and comments and interact with content on a website, explained Xiang. Being Ethereum-native, the standard is also expected to be able to interact with other blockchains more easily, Xiang added.

Original post:
Forget HTTP: Ethereum has a new URL standard that can't be blocked - Cointelegraph
