
Outlook on the Oman Data Center Market to 2028: Increasing Demand for Migration to Cloud Platforms by Various Sectors Attracts Local and Global Cloud…


Dublin, June 09, 2023 (GLOBE NEWSWIRE) -- The "Oman Data Center Market - Investment Analysis & Growth Opportunities 2023-2028" report has been added to ResearchAndMarkets.com's offering.

The Oman data center market is expected to grow at a CAGR of 9.60% from 2022-2028.

Key Highlights

The increasing demand for migration to cloud platforms by various sectors attracts local and global cloud service providers in the Oman data center market. For instance, Global Cloud Service operators such as AWS, Microsoft, Oracle, and Google have a presence with local operators in Oman.

In December 2022, the Ministry of Transport, Communications and Information Technology (MTCIT) in Oman signed a Memorandum of Understanding (MoU) with Amazon Web Services (AWS) for the launch of cloud service data centers in the country.

Regarding 5G connectivity, telecom operators like Omantel, Ooredoo, Nokia, Ericsson, and Vodafone are working towards deploying 5G technology in the country.

The government has launched strategies such as Vision 2040 and the National Energy Strategy, which aim to generate around 30% of electricity from renewable energy sources by 2030.

In August 2022, the Ministry of Housing and Urban Planning was involved in developing three smart cities in the country as part of Oman Vision 2040. The smart cities are expected to be developed in Salalah, Nizwa, and Suhar.

Reasons to Buy

Market size in terms of investment, area, power capacity, and Oman colocation market revenue.

An assessment of the data center investment in Oman by colocation, hyperscale, and enterprise operators.

Investments in the area (square feet) and power capacity (MW) across states in the country.

A detailed study of the existing Oman data center market landscape, an in-depth industry analysis, and insightful predictions about industry size during the forecast period.

Snapshot of existing and upcoming third-party data center facilities in Oman

The Oman data center market investments are classified into IT, power, cooling, and general construction services with sizing and forecast.

A comprehensive analysis of the latest trends, growth rate, potential opportunities, growth restraints, and prospects for the industry.

Business overview and product offerings of prominent IT infrastructure providers, construction contractors, support infrastructure providers, and investors operating in the industry.

A transparent research methodology and the analysis of the demand and supply aspects of the industry.


VENDOR LANDSCAPE

The Oman data center market has the presence of several local and global operators, such as Equinix, Ooredoo, Datamount, Cloud Acropolis, and Oman Data Park.

The prominent growth potential of the industry has also attracted several new entrants; for instance, Gulf Data Hub is expanding its footprint across the Middle East, with two pipeline projects in Oman.

EXISTING VS. UPCOMING DATA CENTERS

REPORT COVERAGE:

This report analyses the Oman data center market share. It provides an elaborate analysis of the existing and upcoming facilities and of investments in IT, electrical, and mechanical infrastructure, general construction, and tier standards. It discusses market sizing and investment estimation for different segments.

The segmentation includes:

IT Infrastructure

Servers

Storage Systems

Network Infrastructure

Electrical Infrastructure

UPS Systems

Generators

Switches & Switchgears

PDUs

Other Electrical Infrastructure

Mechanical Infrastructure

Cooling Systems

Racks

Other Mechanical Infrastructure

Cooling Systems

CRAC & CRAH Units

Chiller Units

Cooling Towers, Condensers & Dry Coolers

Other Cooling Units

General Construction

Core & Shell Development

Installation & Commissioning Services

Engineering & Building Design

Fire Detection & Suppression Systems

Physical Security

Data Center Infrastructure Management (DCIM)

Tier Standard

Tier I & Tier II

Tier III

VENDOR LANDSCAPE

Major Vendors

IT Infrastructure Providers

Data Center Construction Contractors & Sub-Contractors

AECOM

DC Pro Engineering

Direct Services

Hill International

Turner & Townsend

Support Infrastructure Providers

Data Center Investors

Equinix

Ooredoo

Oman Data Park

Datamount

CloudAcropolis

New Entrants

KEY QUESTIONS ANSWERED:

How much is the Oman data center market investment expected to grow?

What is the growth rate of the Oman data center market?

How many data centers have been identified in Oman?

What are the driving factors for the Oman data center market?

Who are the key investors in the Oman data center market?

Key Attributes:

No. of Pages: 92

Forecast Period: 2022-2028

Estimated Market Value (USD) in 2022: $247 Million

Forecasted Market Value (USD) by 2028: $428 Million

Compound Annual Growth Rate: 9.6%

Regions Covered: Oman

Key Topics Covered:

1. Existing & Upcoming Third-Party Data Centers in Oman
1.1. Historical Market Scenario
1.2. 5+ Unique Data Center Properties
1.3. Data Center IT Load Capacity
1.4. Data Center White Floor Area Space
1.5. Existing vs. Upcoming Data Center Capacity by Cities
1.6. Cities Covered
1.6.1. Muscat
1.6.2. Other Cities
1.7. List of Upcoming Data Center Projects in Oman

2. Investment Opportunities in Oman
2.1. Microeconomic and Macroeconomic Factors in Oman
2.2. Investment Opportunities in Oman
2.3. Investment by Area
2.4. Investment by Power Capacity

3. Data Center Colocation Market in Oman
3.1. Colocation Services Market in Oman
3.2. Retail vs. Wholesale Data Center Colocation
3.3. Colocation Pricing (Quarter Rack, Half Rack, Full Rack) & Add-Ons

Link:
Outlook on the Oman Data Center Market to 2028: Increasing Demand for Migration to Cloud Platforms by Various Sectors Attracts Local and Global Cloud...


BEYOND LOCAL: Computer scientist explains why blockchain is a … – The Longmont Leader

The following article, written by Yu Chen, Binghamton University, State University of New York originally appeared on The Conversation and is published here with permission:

People hear a lot about blockchain technology in relation to cryptocurrencies like bitcoin, which rely on blockchain systems to keep records of financial transactions between people and businesses. But a crash in public trust in cryptocurrencies like TerraUSD, and the resulting massive drop in their market value, doesn't mean their underlying technology is also worthless.

In fact, there are plenty of other uses for this type of system, which does not rely on centralized storage and where many people can participate securely, even if they don't all know each other.

As a computer scientist exploring new technologies for future smart communication networks, I, along with many engineers and developers, have shown that blockchain technology is a promising solution to many challenging problems in trust and security of next-generation network-based applications. I see several ways blockchains are proving themselves useful that aren't tied to cryptocurrency.

Modern global supply chains require a huge amount of information for the massive number of products being shipped around the world. They suffer from limits on data storage capacity, inefficient paper processes, disjointed data systems and incompatible data formats. These traditional centralized data storage methods cannot efficiently trace the origin of problems, like where a poor-quality product came from.

Storing information on a blockchain improves integrity, accountability and traceability. For example, IBM's Food Trust uses a blockchain system to track food items from the field to retailers. The participants in the food supply chain record transactions in the shared blockchain, which simplifies keeping track.
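To make the traceability claim concrete, here is a minimal, illustrative Python sketch of hash-linked records. It is a toy example written for this article, not IBM Food Trust's actual implementation, and the shipment events are invented placeholders.

import hashlib
import json


def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the hash of the record before it."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()


# A toy shared ledger: each entry stores a supply chain event plus the hash
# that chains it to the previous entry.
ledger = []
prev = "0" * 64  # genesis value
for event in [
    {"item": "lettuce", "step": "harvested", "actor": "farm-17"},
    {"item": "lettuce", "step": "shipped", "actor": "carrier-03"},
    {"item": "lettuce", "step": "received", "actor": "retailer-42"},
]:
    prev = record_hash(event, prev)
    ledger.append({"event": event, "hash": prev})

# Verification: recompute the chain. Editing any earlier event breaks every
# hash after it, which is what makes the record tamper-evident and traceable.
check = "0" * 64
for entry in ledger:
    check = record_hash(entry["event"], check)
    assert check == entry["hash"]
print("ledger verified")

Because each hash depends on everything recorded before it, a participant who quietly rewrites one shipment record invalidates the rest of the chain, which is why shared ledgers simplify tracing the origin of problems.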

Data ownership and privacy are top concerns in the health care industry. Current centralized systems cannot meet all the diverse needs of patients, health service providers, insurance companies and governmental agencies. Blockchain technology enables a decentralized system for access control of medical records where all stakeholders' interests are protected.

Blockchain systems not only allow health care service providers to securely share patients' medical records but also enable patients to track who has accessed their records and determine who is authorized to do so.

Banking and finance benefit from integrating blockchain networks into their business operations. Instead of trying to develop cryptocurrencies with new or different capabilities, the financial sector has recognized that blockchain systems are a reliable way to store information about traditional currencies like the dollar, euro and yen, as well as financial products.

Blockchains provide consumers with the convenience of being able to monitor their transactions as they are processed, almost in real time from anywhere. Banks also benefit from blockchains, with the opportunity to conduct business between institutions more efficiently and securely.

Today's manual process of recording property rights is burdensome and inefficient. Traditional paper documentation is time-consuming, labor intensive, not transparent and vulnerable to loss. Blockchain technology eliminates inconvenience, inefficiency and errors, and reduces the cost by migrating the entire process into a digital form.

Blockchain systems allow owners to trust that their deed is accurate and permanently recorded. Remote access is particularly meaningful to people living in areas without sufficient governmental or financial infrastructure.

Validating votes and maintaining voter privacy seem like conflicting requirements. Blockchain systems hold promise as a means to facilitate a fair and transparent modern voting system. Because it's almost impossible to tamper with a blockchain-enabled voting system, it can maintain a transparent electoral process.

In the November 2018 midterm elections in West Virginia, a blockchain-based voting system was used and found to be secure and reliable.

A smart city embeds information and communication technologies into its facilities, infrastructure and services to provide its residents a convenient, intelligent and comfortable living space. A smart city is essentially a network of many devices that can communicate with each other to share data. Connected devices can include people's smartphones, vehicles, electrical meters, public safety monitoring systems and even homes.

These systems have performance, security and privacy requirements that centralized information systems cannot handle. Blockchain is a key networking technology for building smart cities because it's able to optimize operations, enhance security guarantees and increase mutual trust among participants.

The future of information technology is all about decentralization. Today's centralized architecture fails to meet the increasingly diverse needs of people who want freedom to personalize their own services, control their digital assets and more easily participate in democratic processes. Blockchain is a key enabling technology for building any secure and durable decentralized information system.

Yu Chen, Professor of Electrical and Computer Engineering, Binghamton University, State University of New York. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Excerpt from:

BEYOND LOCAL: Computer scientist explains why blockchain is a ... - The Longmont Leader


Serverless is the future of PostgreSQL – InfoWorld

PostgreSQL has been hot for years, but that hotness can also be a challenge for enterprises looking to pick between a host of competing vendors. As enterprises look to move off expensive, legacy relational database management systems (RDBMS) but still want to stick with an RDBMS, open source PostgreSQL is an attractive, less-expensive alternative. But which PostgreSQL? AWS was once the obvious default with two managed PostgreSQL services (Aurora and RDS), but now there's Microsoft, Google, Aiven, TimeScale, Crunchy Data, EDB, Neon, and more.

In an interview, Neon founder and CEO Nikita Shamgunov stressed that among this crowd of pretenders to the PostgreSQL throne, the key differentiator going forward is serverless. "We are serverless, and all the other ones except for Aurora, which has a serverless option, are not," he declares. If he's right about the importance of serverless for PostgreSQL adoption, it's possible the future of commercial PostgreSQL could come down to a serverless battle between Neon and AWS.

In some ways, serverless is the fulfillment of the cloud's promise. Almost since the day it started, for example, AWS has pitched the cloud as a way to offload the undifferentiated heavy lifting of managing servers, yet even with services like Amazon EC2 or Amazon RDS for PostgreSQL, developers still had to think about servers, even if there was much less work involved.

In a truly serverless world, developers don't have to think about the underlying infrastructure (servers) at all. They just focus on building their applications while the cloud provider takes care of provisioning servers. In the world of databases, a truly serverless offering will separate storage and compute, and substitute the database's storage layer by redistributing data across a cluster of nodes.

Among other benefits of serverless, as Anna Geller, Kestra's head of developer relations, explains, serverless encourages useful engineering practices. "For example, if we can agree that it's beneficial to build individual software components in such a way that they are responsible for only one thing," she notes, then serverless helps because it encourages code that is easy to change and stateless. Serverless all but forces a developer to build reproducible code. She says, "Serverless doesn't only force you to make your components small, but it also requires that you define all resources needed for the execution of your function or container."

The result: better engineering practices and much faster development times, as many companies are discovering. In short, there is a lot to love about serverless.

Shamgunov sees two primary benefits to running PostgreSQL serverless. The first is that developers no longer need to worry about sizing. All the developer needs is a connection string to the database, without worrying about size or scale; Neon takes care of that completely. The second benefit is consumption-based pricing, with the ability to scale down to zero (and pay zero). This ability to scale to zero is something that AWS doesn't offer, according to Ampt CEO Jeremy Daly: "Even when your app is sitting idle, you're going to pay."

But not with Neon. As Shamgunov stresses in our interview, "In the SQL world, making it truly serverless is very, very hard." There are shades of gray in terms of how companies try to deliver that serverless promise of scaling to zero, but only Neon can currently do so, he says.

Do people care? The answer is yes, he insists. "What we've learned so far is that people really care about manageability, and that's where serverless is the obvious winner. [It makes] consumption so easy. All you need to manage is a connection stream." This becomes increasingly important as companies build ever-bigger systems with bigger and bigger fleets. Here, it's "a lot easier to not worry about how big [your] compute [is] at a point in time." In other systems, you end up with runaway costs unless you're focused on dialing resources up or down, with a constant need to size your workloads. But not in a fully serverless offering like Neon, Shamgunov argues. "Just a connection stream and off you go. People love that."
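To ground the "all you need to manage is a connection string" point, here is a minimal sketch of consuming a serverless Postgres database from application code. The DSN below is a placeholder, and psycopg2 is just one of many PostgreSQL drivers that accept such a string; nothing here is specific to Neon.

import psycopg2  # standard PostgreSQL driver; any driver that accepts a DSN works

# The entire "infrastructure" the developer manages is this connection string.
# Placeholder values: a serverless provider hands you a real one at creation time.
DSN = "postgresql://app_user:secret@example-serverless-host.tld/appdb?sslmode=require"

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT version();")
        print(cur.fetchone()[0])

There is no instance size, node count, or scaling policy anywhere in this code; capacity decisions happen entirely on the provider's side.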

Not everything is rosy in serverless land. Take cold starts, for example. The first time you invoke a function, the serverless system must initialize a new container to run your code. This takes time and is called a cold start. Neon has been putting a non-trivial amount of engineering budget into solving the cold-start problem, Shamgunov says. This follows a host of other performance improvements the company has made, such as speeding up PostgreSQL connections.

Neon also uniquely offers branching. As Shamgunov explains, Neon supports copy-on-write branching, which allows people to run a dedicated database for every preview or every GitHub commit. This means developers can branch a database, which creates a full copy of the data and gives developers a separate serverless endpoint to it. "You can run your CI/CD pipeline, you can test it, you can do capacity or all sorts of things, and then bring it back into your main branch. If you don't use the branch, you spend $0. Because it's serverless. Truly serverless."

All of which helps Neon deliver on its promise of being "as easy to consume as Stripe," in Shamgunov's words. To win the PostgreSQL battle, he continues, "You need to be as developer-friendly as Stripe." You need, in short, to be serverless.

Originally posted here:
Serverless is the future of PostgreSQL - InfoWorld


Cloud Storage Market is driven by increasing demand for storage … – Digital Journal

PRESS RELEASE

Published June 8, 2023

New York - According to a new market research report published by Global Insight Services, the Cloud Storage Market is expected to grow through 2031. The report includes in-depth segmentation and market size data by categories, product types, applications, and geographies. It also includes comprehensive analysis of key issues, trends and drivers, restraints and challenges, and the competitive landscape, as well as recent events such as M&A activities in the market.

Download Sample of Report : https://www.globalinsightservices.com/request-sample/GIS21647

Cloud storage is a service that allows users to save files online so that they can be accessed from any internet-connected device. The files are stored on servers that are owned and operated by the cloud storage provider, and users can access their files via a web interface or a mobile app. Cloud storage is convenient for users because it allows them to access their files from anywhere, and it is also more secure than storing files on a local hard drive because the files are stored in a remote location and are backed up by the cloud storage provider.

Key Trends

There are four key trends in cloud storage technology:

1. Increased use of object storage: Object storage is a type of data storage that stores data as objects, with each object consisting of a data file and its associated metadata. Object storage is becoming increasingly popular for storing data in the cloud, as it is more scalable and easier to manage than traditional file storage systems (a short code sketch after this list shows what storing an object with metadata looks like).

2. Increased use of file-based storage: File-based storage systems store data as files, with each file being stored on a separate server. File-based storage is becoming increasingly popular for storing data in the cloud, as it is more scalable and easier to manage than traditional block storage systems.

3. Increased use of cloud-native storage: Cloud-native storage is a type of storage that is designed specifically for use in the cloud. Cloud-native storage is more scalable and easier to manage than traditional storage systems, and it is becoming increasingly popular for storing data in the cloud.

4. Increased use of flash storage: Flash storage is a type of storage that uses flash memory to store data. Flash storage is more expensive than traditional storage systems, but it is faster and more durable. Flash storage is becoming increasingly popular for storing data in the cloud.
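To illustrate trend 1 above, the sketch below stores and reads back an object with user-defined metadata using boto3. The bucket name, key, and file are placeholders, credentials are assumed to come from the environment, and most S3-compatible object stores accept the same calls.

import boto3  # AWS SDK for Python

s3 = boto3.client("s3")

# An "object" is the data blob plus its metadata, addressed by a key.
with open("cloud-storage-report.pdf", "rb") as body:           # placeholder file
    s3.put_object(
        Bucket="example-reports-bucket",                        # placeholder bucket
        Key="2023/06/cloud-storage-report.pdf",                 # placeholder key
        Body=body,
        Metadata={"department": "research", "retention": "7y"},  # user-defined metadata
    )

# Reading the object's headers back also returns its user-defined metadata.
head = s3.head_object(Bucket="example-reports-bucket",
                      Key="2023/06/cloud-storage-report.pdf")
print(head["Metadata"])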

Key Drivers

There are several key drivers of the cloud storage market. One of the most important drivers is the increasing demand for storage capacity. With the growth of digital content, more businesses and individuals are needing to store large amounts of data. Cloud storage provides a scalable and flexible solution to this problem.

Another key driver is the need for data backup and disaster recovery. Cloud storage can provide a reliable and cost-effective way to store critical data off-site. This can be especially important for businesses that cannot afford to lose data due to a natural disaster or other catastrophic event.

Finally, the increasing use of mobile devices is also driving the cloud storage market. More people are using smartphones and tablets to access data and applications. This trend is only expected to continue, which means that businesses will need to find ways to store and manage mobile data. Cloud storage is a great option for this purpose.

Get A Customized Scope to Match Your Need Ask an Expert : https://www.globalinsightservices.com/request-customization/GIS21647

Market Segments

The Cloud Storage Market is segmented into component, deployment mode, user type, and region. Depending on component, it is bifurcated into solution and services. On the basis of deployment mode, it is classified into private, public, and hybrid. The user type segment includes large enterprises, and small & medium size enterprises. Region-wise, the market is segmented into North America, Europe, Asia-Pacific, and the rest of the World.

Key Players

The Cloud Storage Market report includes players such as Alibaba Group Holding Limited, Amazon Web Services, Cisco Systems, Inc., Dell Technologies Inc., Google LLC, Hewlett Packard Enterprise Development LP, International Business Machines Corporation, Microsoft Corporation, Oracle Corporation and Rackspace Hosting, Inc.

Reasons to buy Cloud Storage Market Report:

Develop comprehensive understanding of market landscape: industry structure, value-chain, key players, trends, drivers, and challenges

Drive revenue and market-entry strategy by gaining insights into which segments and geographies are largest and likely to grow fastest

Formulate sales and marketing strategies by gaining understanding of competitors, their positioning, and strengths & weaknesses

Develop business and M&A strategies through understanding of latest trends and emerging players in the market

Refine your business plans by understanding impact of disruptions such as Covid-19 and the Russia-Ukraine conflict on the market

New Report Published by Global Insight Services : https://www.globalinsightservices.com/reports/hydrogen-projects-database/

About Global Insight Services:

Global Insight Services (GIS) is a leading multi-industry market research firm headquartered in Delaware, US. We are committed to providing our clients with highest quality data, analysis, and tools to meet all their market research needs. With GIS, you can be assured of the quality of the deliverables, robust & transparent research methodology, and superior service.

Contact Us:

Global Insight Services LLC
16192 Coastal Highway, Lewes, DE 19958
E-mail: [emailprotected]
Phone: +1-833-761-1700
Website: https://www.globalinsightservices.com/

Read the original post:
Cloud Storage Market is driven by increasing demand for storage ... - Digital Journal


A data privacy law is becoming more urgent every day – The Dallas Morning News

Last month, the Irish Data Protection Commission (DPC), a leading European Union privacy regulator, issued a landmark ruling suspending Meta's cross-border transfer of personal data from its EU users to its U.S.-based servers. The DPC's ruling includes a record $1.3 billion fine and could potentially shutter access to Facebook and Instagram across the EU.

The consequences, though, could extend far beyond Meta. In its ruling, the DPC held that businesses cannot rely on the so-called Standard Contractual Clauses, a legal framework used by Meta and others to transfer data from the EU to the U.S. in compliance with European privacy law. That framework is used by more than 90% of businesses engaged in transatlantic commerce, a nearly $7.1 trillion industry.

A disruption in transatlantic data flows would not only take a bite out of the economy, but also impact any company doing business in the EU which relies on U.S.-based software or cloud hosting.

The DPC's ruling against Meta is the latest in a long line of high-profile EU decisions that raise fundamental questions about whether European privacy law permits any type of transatlantic data transfer, given the potential for U.S. intelligence agencies to surveil personal data held by American companies.


In other words, the concern driving the EU to ban data transfers to companies like Meta closely resembles the concern driving Americans to ban TikTok: That a foreign government may be able to surveil personal data held by a foreign company.

The upshot of this overlap (aside, perhaps, from a gut check that Europeans view the U.S. like Americans view China) is that both problems can be addressed through a single solution: a comprehensive federal privacy law.

This law can empower consumers to manage their own data in a way that minimizes what information businesses can collect or retain and, by extension, what governments can glean from those businesses. This would solve the problem from the bottom up, with consumers themselves limiting what data is potentially subject to corporate misuse or government surveillance.

The U.S. has much to gain from passing a comprehensive federal privacy law. Such a law would not only strengthen consumer privacy (by giving consumers more control) and protect national security (by reducing what data companies like TikTok can collect), but also provide American businesses with much-needed regulatory certainty.

The latter is particularly salient given the DPC's recent ruling against Meta. It is not tenable for American businesses to operate in a state of limbo as to whether their services or software will one day be banned in Europe or other privacy-conscious jurisdictions. And European policymakers have indicated that passing a comprehensive U.S. privacy law is the best way to stabilize EU-U.S. data flows over the long term.

Indeed, the European Commission is currently rushing to finalize a new transatlantic data transfer mechanism known as the EU-U.S. Data Privacy Framework. This new framework accounts for recent changes in U.S. surveillance policy made to address EU concerns. Yet these changes have been poorly received by European policymakers because they are the product of executive rather than legislative action, and the U.S., unlike most developed countries, lacks a comprehensive privacy law that enshrines baseline protections.

In the short term, these concerns should not stop the European Commission from finalizing the new framework to sustain data flows in the wake of the DPC's Meta decision. The commission has already indicated that, at least provisionally, the new framework adequately addresses concerns about mass surveillance, and a $7.1 trillion economic disruption would harm both the U.S. and Europe.

Even if the framework is finalized, though, it may not deliver the long-term certainty that American businesses need. Like its predecessors, it will face almost immediate legal jeopardy unless or until the United States passes a comprehensive privacy law.

Rarely can a single act of legislation serve the interests of industry, consumers and national security. A federal privacy law would do just that, but the clock is ticking.

Jordan E.M. Sessler serves as privacy counsel for BigCommerce, a Texas-based ecommerce software provider. The views expressed are his own and not those of BigCommerce. He wrote this column for The Dallas Morning News.

We welcome your thoughts in a letter to the editor. See the guidelines and submit your letter here.

Follow this link:
A data privacy law is becoming more urgent every day - The Dallas Morning News


Understanding cybersecurity issues that keep organisations on their … – Gulf Business


There has been exponential technological advancement in recent years, bringing with it immense cybersecurity challenges. In today's world, a large amount of data is being transmitted and accessed outside offices as most employers deploy a remote or a hybrid workforce. This data mobilisation has raised serious concerns regarding cybersecurity and encryption.

According to a recent report by cybersecurity firm NortonLifeLock, the average cost of cybercrime in the UAE is estimated to be around $2.6m per incident. This represents a 17 per cent increase from the previous year's average cost of cybercrime in the UAE.

Remote work and increasing reliance on cloud-based solutions has resulted in a massive shift in how data is utilised, accessed and transmitted.

However, with employees now working virtually from anywhere, newfound security challenges have emerged.

In addition, vulnerabilities related to unencrypted USBs and cloud storage, and limitations of software encryption have also highlighted shortfalls of traditional security measures.

Hackers are quick to exploit weaknesses in organisations, and are continuously seeking opportunities to intercept and compromise enterprise data security. It is not uncommon for employees to carry sensitive organisational information through USBs.

Many companies also routinely connect USBs to critical information systems for transferring and storing data. This makes unencrypted USBs prone to several risks and vulnerabilities.

Data from the 2022 Honeywell Industrial Cybersecurity USB Threat report indicates that 52 per cent of threats were specifically designed by malicious actors to utilise removable media, clearly indicating that the threats linked to removable drives have escalated.

The high portability and small size of USB drives make them easy to lose, putting data at risk. Infected USB disks also make it easy for attackers to gain access to systems and introduce malware into company networks, giving rise to increased ransomware attacks.

It is surprising that despite the vast amount of information available, cyberthreats are constantly increasing and evolving. Ease of availability, low cost and easy accessibility make flash drives a popular choice for businesses despite the numerous vulnerabilities they bring with them.

Software encryption adds a layer of protection for drives by encrypting the data stored on them. However, this type of encryption has its limitations: it may be less reliable than hardware encryption methods and can be easily removed by the user.

Software encryption also requires software updates and slows read and write speeds, which makes it difficult for businesses to strike a balance between security and performance.
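As a generic illustration of what software encryption of stored data involves (not tied to any particular drive or vendor), here is a minimal Python sketch using the cryptography library. The file name is a placeholder, and the key itself must still be stored and protected somewhere, which is part of the limitation described above.

from cryptography.fernet import Fernet  # symmetric, authenticated encryption

# Key management is the weak point of software encryption: this key has to live
# somewhere the user, or malware on the host, can reach.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("quarterly_report.xlsx", "rb") as f:          # placeholder file name
    ciphertext = cipher.encrypt(f.read())

with open("quarterly_report.xlsx.enc", "wb") as f:
    f.write(ciphertext)

# Decryption reverses the process and fails if the ciphertext was tampered with.
plaintext = cipher.decrypt(ciphertext)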

Hardware encrypted USB drives have built-in encryption within the drive and are designed to avoid tampering. They have custom architectures which incorporate an onboard encryption controller and access control, and use digitally-signed firmware that protects the authenticity and integrity of the drive. They also limit password attempts and comply with industry best practices to ward off malicious actors.

Cloud storage is a popular choice for storing and accessing data remotely. However, despite the convenience and ease of access, data stored in the cloud always carries a risk.

Cloud data can be accessed from anywhere; local coffee shops, internet cafes and restaurants, or even co-working spaces. Hackers can target cloud servers, potentially compromising sensitive information stored in the cloud.

Furthermore, entrusting data to a third party may complicate the process of identifying and mitigating data breaches, making it harder to determine what information has been leaked.

Encrypted USBs provide a secure alternate solution to access files and information without compromising an organisation's cloud. Hardware encrypted devices have to meet tough security standards and thus offer the improved data protection that helps manage, contain and reduce cyber risks.

It is imperative that organisations prioritise data security and invest in comprehensive data security solutions by employing a holistic approach that encompasses hardware encryption, education of employees on best practices, and working closely with trusted cloud service providers to mitigate risks.

By adopting a multi-layered approach through increased user awareness, businesses can protect their valuable information and maintain the trust of their customers and stakeholders in this ever-evolving digital landscape.

Antoine Harb is the team leader for the Middle East at Kingston Technology.

Go here to see the original:
Understanding cybersecurity issues that keep organisations on their ... - Gulf Business


Securing Your CI/CD Pipeline: Exploring the Dangers of Self-Hosted Agents – Security Boulevard

Continuous Integration/Continuous Deployment (CI/CD) pipelines have become crucial to modern software development practices. CI/CD pipelines can significantly improve development efficiency and software quality by automating the process of building, testing, and deploying code. Most modern CI/CD platforms (like GitHub Actions, CircleCI, etc.) offer an option to run the pipeline process on a self-hosted runner: an agent hosted by the user instead of the CI/CD platform, used to execute jobs on the user's own infrastructure.

With self-hosted runners, you can create custom hardware configurations that meet your needs, with the processing power or memory to run larger jobs. Additionally, you can install software available on your local network and choose an operating system not offered by the platform. Self-hosted runners can be physical, virtual, in a container, on-premises, or in a cloud environment. The alternative is to use a build-as-a-service runner offered by a SaaS CI/CD platform, where the user has no control over the environment.

One might use self-hosted runners for several reasons:

Compliance and privacy: many organizations are bound by strict regulations that prevent them from building and sending sensitive data to SaaS build services. For example, organizations from the financial or government sectors often prefer running the build on their internal servers.

Special specifications: when building software for uncommon operating systems or different architectures, the build process needs a machine with specs that match their requirements. This is also the case in many embedded, IoT, or client-side applications.

Cost saving: organizations with existing cloud infra / private cloud can save costs if they choose to use self-hosted runners and manage resource pools themselves.

Given the reasons above, self-hosted runners have gained popularity in many organizations. However, self-hosted runners can pose significant security risks, especially if untrusted workflows run on them. Malicious programs may compromise the build machine and escape from the runner's sandbox, exposing access to the machine's network, secrets, and more.

Using a self-hosted runner can offer advantages, such as improved performance and increased environmental control. However, there are also security risks associated with it:

Self-hosted runners require dedicated servers or virtual machines (VMs) to operate. The security of these servers is crucial. If the server hosting the runner is not properly secured, it can become a potential entry point for attackers.

Self-hosted runners typically require privileged access to the repository and the system on which they are installed. It's important to carefully manage access control and restrict the permissions granted to the runner. Unauthorized access to the runner could lead to potential data breaches or malicious code execution.

Failure to isolate self-hosted runners from critical systems and sensitive data increases the risk of a compromised runner leading to unauthorized access and potential data breaches. Without proper isolation, attackers who gain control of the runner may be able to move laterally within the network, potentially compromising other systems and data.

Neglecting to regularly update and monitor the self-hosted runner software exposes the system to known vulnerabilities and leaves it susceptible to attacks. Without timely security patches and updates, attackers can exploit known weaknesses in the runner software. Insufficient monitoring increases the chances of undetected malicious activities and delays in responding to security incidents, potentially resulting in extended periods of unauthorized access or damage.

Using a self-hosted runner without restricting access to authorized users allows anyone to exploit compute resources for activities like crypto-mining and other activities that can inflict financial damage.

GitHub Actions have gained significant adoption in recent years as a powerful CI/CD automation tool for software development. With GitHub Actions, developers can automate tasks like building, testing, and deploying software, allowing them to focus on more important tasks. GitHub Actions offers a wide range of customization options, including pre-built actions, community-built actions, and the ability to create your own actions. As more and more organizations adopt GitHub Actions, it is becoming an increasingly important tool for modern software development.

GitHub also provides the option to use self-hosted runners. You can add self-hosted runners at various levels in the management hierarchy, including repository-level runners dedicated to a single repository, organization-level runners that can process jobs for multiple repositories in an organization, and enterprise-level runners that can be assigned to multiple organizations in an enterprise account.

During our research, we've scanned over three million public repositories. For each repository, we've inspected the different GitHub Actions files and extracted the runner used by each job. We have learned that out of 3,106,445 public repositories checked, 43,803 are using self-hosted runners. Using a self-hosted runner in a public repository increases the aforementioned risks dramatically, as any GitHub user in the world could initiate an attack.
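The post does not include the scanning code itself, but a simplified sketch of the check it describes, flagging workflow files whose jobs target self-hosted runners, might look like the following. The glob path is a placeholder pointing at a checked-out repository.

import glob

import yaml  # PyYAML


def uses_self_hosted(workflow_path: str) -> bool:
    """Return True if any job in the workflow file targets a self-hosted runner."""
    with open(workflow_path) as f:
        workflow = yaml.safe_load(f) or {}
    for job in (workflow.get("jobs") or {}).values():
        runs_on = (job or {}).get("runs-on", "")
        labels = runs_on if isinstance(runs_on, list) else [runs_on]
        if "self-hosted" in labels:
            return True
    return False


# Placeholder path: point this at a repository's workflow directory.
flagged = [p for p in glob.glob(".github/workflows/*.y*ml") if uses_self_hosted(p)]
print(f"{len(flagged)} workflow file(s) use self-hosted runners")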

GitHub provides organizations with several options to configure their self-hosted runners. Users need to verify that their organization is following configuration best practices to prevent potential risks.

Runner groups: When a self-hosted runner is defined at the organization or enterprise level, GitHub can schedule workflows from multiple repositories onto the same runner. Consequently, a security compromise in these environments can result in a wide impact. To help reduce the scope of a compromise, you can create boundaries by organizing your self-hosted runners into separate groups.

Disallowing public repositories: If you use self-hosted runners with public repositories and do not properly configure access controls, an attacker can potentially gain access to your self-hosted runner machine by creating a pull request that runs a malicious workflow. This can allow the attacker to execute arbitrary code on your machine, access sensitive data, or perform other malicious actions, depending on the level of access your self-hosted runner machine has.

For example, an attacker could modify the code in a pull request to execute commands that install malware on your self-hosted runner machine or steal sensitive data from your organization. This can lead to serious consequences, including data breaches, loss of intellectual property, and damage to your organization's reputation.

To avoid this risk, it's advised to use self-hosted runners only with private repositories. This limits the attack surface and reduces the risk of unauthorized access to your self-hosted runner machine.

Access control: You can restrict what organizations and repositories can access runner groups. Additionally, you can restrict the runners to specific workflows. For more information, see Managing access to self-hosted runners using groups.

In a blog post by @ryotkak, we can see an example of how self-hosted runners impose risk and, if used incorrectly, can lead to sensitive information theft. ryotkak shows an example of a workflow created by GitHub; the vulnerable workflow is part of the Actions framework itself. Under specific conditions, the self-hosted runner was reachable by any attacker. The token stolen in that example was owned by a developer from GitHub itself, which attackers could have leveraged to carry out a wide-impact software supply chain attack.

In this demo, we show how we created a public GitHub repository with a self-hosted runner. This example shows an external malicious collaborator performing several attacks.

In the first image, we can see the runner pwn_me is added to the runner group pool.

Our next step was to configure a workflow that runs on our self-hosted runner and is triggered by the pull_request trigger:

jobs:
  hellow_world:
    runs-on: self-hosted
    steps:
      - uses: "actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b"
        with:
          fetch-depth: 0
      - name: dummy_action
        run: echo "hello world" && ls -la

Now, all an attacker needs to do is create a fork with a modified workflow file that retrieves the runner's secrets and sensitive information. An example of a job created by a malicious contributor:

This video shows how we created a pull request from a fork and gained access to the /etc/passwd file as an example.

The following best practices are relevant to all self-hosted CI/CD platforms. Whether you use GitHub Actions, Jenkins, GitLab, or any other platform, these guidelines will help keep your self-hosted runners safe and your software supply chain secure:

Make sure no sensitive information is stored on the runner, for example private SSH keys and API access tokens.

Make sure the runner has minimum network access to other internal assets, for example Azure or AWS management consoles or metadata services. Management consoles and metadata services are interfaces the cloud vendor provides for retrieving cloud environment information or performing management actions. An attacker with access to these resources can change the victim's cloud configuration or extend their attack surface. The amount of sensitive information in this environment should be minimal, and you should always be mindful that any user capable of invoking workflows has access to this environment (a short reachability sketch after this list of practices illustrates the risk).

Ensure all services inside the runner are up-to-date and free from known vulnerabilities.

Avoid persistence: make sure each job runs on a clean, fresh agent container/VM.

If using containers, run at minimum permissions.

Avoid running containers in privileged mode.

Make sure the container is mounted to dedicated devices and does not share resources with the host.

Run only a single container per host.

Use HTTPS for communication: Use HTTPS to encrypt communication between your self-hosted runner and GitHub, and avoid using unencrypted HTTP.
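As a concrete illustration of the network-access point above (the item on management consoles and metadata services), the sketch below shows the kind of probe a workflow step could run from an over-permissive runner. The address is the well-known AWS instance metadata endpoint, other clouds expose equivalents, and the snippet is a defensive reachability test rather than an exploit.

import requests

# Link-local metadata endpoint used by AWS EC2; other cloud providers expose
# similar services at provider-specific addresses.
IMDS = "http://169.254.169.254/latest/meta-data/"

try:
    # If this succeeds from inside a runner, workflow code can start harvesting
    # instance details and, on poorly configured hosts, temporary credentials.
    resp = requests.get(IMDS, timeout=2)
    print("metadata service reachable:", resp.status_code)
except requests.RequestException:
    print("metadata service blocked: runner network is restricted as intended")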

Overall, self-hosting CI/CD pipelines can be risky if not properly managed. To minimize the potential dangers, its essential to follow security best practices, such as regularly updating self-hosted agents, implementing access controls, and monitoring for suspicious activity. Additionally, organizations should consider using cloud-based CI/CD services that can provide better security and reliability than self-hosted solutions.

The Legit Security platform connects to your build environment to detect attack attempts in your pipeline and much more. We identify different misconfigurations, such as a public repository that uses a self-hosted runner. If you are concerned about these security risks and others across your software supply chain, please contact us to request a demo on our website.

*** This is a Security Bloggers Network syndicated blog from Legit Security Blog authored by Nadav Noy. Read the original post at: https://www.legitsecurity.com/blog/securing-your-ci/cd-pipeline-exploring-the-dangers-of-self-hosted-agents

Original post:
Securing Your CI/CD Pipeline: Exploring the Dangers of Self-Hosted Agents - Security Boulevard


When and When Not to Choose Colocation Over a Private Data Center – Data Center Knowledge


Once upon a time, every business that needed a data center built and operated it itself.


Those days are gone. Today, companies can take advantage of colocation, a practice that allows them to deploy IT equipment in data centers owned and managed by a third-party provider.

That doesn't mean, though, that a colocation data center is always the best choice. Keep reading for an overview of how colocation works, along with tips on deciding when it does and doesn't make sense to migrate workloads to a colocation facility instead of creating your own data center to house them.


Colocation, or colo for short, is the use of third-party data center facilities to house IT equipment. Companies that use colocation choose a data center operated by a colocation provider, then set up their own servers or other infrastructure inside one of the provider's facilities.

Typically, a business that uses a colocation facility is responsible for purchasing, deploying, and maintaining its own infrastructure inside that facility, although the colocation provider may offer managed services to cover some or all of those tasks. Either way, the colocation provider handles management of the physical facility, including keeping it powered, ensuring physical security, providing access to network infrastructure, and so on.

Importantly, colocation is different from using a public cloud. In the cloud, workloads are hosted on infrastructure that is fully managed by a cloud provider. That's different from colocation, where you usually rent data center space from a third-party provider, but not the infrastructure.

Compared with operating your own data center, a colocation data center offers several benefits:

The list of colocation benefits could go on, but these points summarize the main advantages.

On the other hand, colocation can have its drawbacks:

Do the benefits of a colocation data center outweigh the drawbacks? The answer depends, of course, on what your use cases and priorities are.

Colocation is typically a better approach for companies whose workloads are small in scale and are not likely to grow significantly. If you don't have enough servers to fill up your own data center, it makes more sense to host them in a colocation facility.

Colocation is also a benefit to companies with limited IT staffs, since the colocation model effectively makes it possible to outsource some data center management tasks to a third-party provider.

And if you need to deploy infrastructure in several disparate locations, colocation is often the most cost-effective way to do it. That's a benefit if, for example, you have users concentrated in different geographic areas and you want to host your workloads in close geographic proximity to each of them.

But if you have a very large-scale infrastructure or you only need to operate your servers in one specific area, colocation probably doesn't make as much sense for you. You're likely to be better off investing in your own data center.

See original here:
When and When Not to Choose Colocation Over a Private Data Center - Data Center Knowledge


Google puts $1M behind its promise to detect cryptomining malware – The Register

Google Cloud has put $1 million on the table to cover customers' unauthorized compute expenses stemming from cryptomining attacks if its sensors don't spot these illicit miners.

Unlike their louder, flashier counterparts (looking at you, ransomware crews), cryptominers are stealthier. Once they've broken into a victim's compute environment, often via compromised credentials, they keep quiet, deploying mining malware and then raking in cryptocurrencies using the stolen compute resources.

This goes on until they get caught, which usually happens when a victim notices other legit workloads' performance lagging while their computing costs spike.

Plus, according to security researchers, illicit mining is on the rise. Google's Cybersecurity Action Team found that 65 percent of compromised cloud accounts experienced cryptocurrency mining [PDF].

The chocolate factory is confident that it can promptly detect and stop these attacks, and to that end it is adding cryptomining protection, with up to $1 million to cover unauthorized Google Cloud compute expenses associated with undetected cryptomining attacks, for its Security Command Center Premium customers.

Security Command Center is Google Cloud's built-in security and risk-management platform, and the new service scans virtual machine memory for mining malware. In a blog post today, Google Cloud's Greg Smith and Tim Peacock describe the cryptomining detector thus:

And, if this doesn't protect the cloud security product's premium customers, then Google will reimburse them up to $1 million.

Earlier this year, security researchers uncovered a sneaky mining botnet dubbed HeadCrab that uses bespoke malware to mine for Monero cryptocurrency and has infected at least 1,200 Redis servers in the last 18 months.

The compromised servers span the US, UK, Germany, India, Malaysia, China and other countries, according to Aqua Security's Nautilus researchers, who discovered the HeadCrab malware and have now found a way to detect it.

Based on the attacker's Monero wallet, the researchers estimate that the crooks expected an annual profit of about $4,500 per infected worker.

Read more:
Google puts $1M behind its promise to detect cryptomining malware - The Register


Mitigating load shedding with cloud impacts business operations – IT-Online

Load shedding poses significant challenges for businesses, disrupting operations, and hindering productivity. However, the cloud offers a viable solution to mitigate the impacts of power outages.

By Garry Ackerman, CEO of Argantic

By embracing cloud-based solutions for email and file sharing, productivity platforms, and data centres, companies can ensure uninterrupted access to essential services, enhance collaboration, and safeguard critical data.

As load shedding continues to affect the economy, embracing the cloud becomes an imperative for businesses seeking resilience and continuity in the face of power disruptions.

This article explores how businesses are leveraging cloud-based solutions, specifically in email and file sharing, productivity platforms, and data centres, to mitigate the impacts of load shedding.

Email and file sharing in the cloud

One of the primary concerns during load shedding is communication and collaboration. Traditional on-premises email servers and file-sharing systems are heavily reliant on a consistent power supply. When power outages occur, businesses lose access to critical communication channels, hindering employee productivity and impeding collaboration among teams.

To address this challenge, companies are migrating their email and file-sharing systems to the cloud. Cloud-based email solutions, such as Microsoft 365 (formerly Office 365) and Google Workspace, offer several advantages over their on-premises counterparts.

These platforms are hosted in data centres with robust power backup systems, ensuring uninterrupted access to email services even during load shedding events. Employees can continue to send and receive emails, access shared files, and collaborate seamlessly, regardless of power disruptions at their physical locations.


Read the original:
Mitigating load shedding with cloud impacts business operations - IT-Online
