
Will the security benefits of cloud computing outweigh its risks in 2022? – Bobsguide

Stakeholders and executives of financial organisations remain on the fence about whether the advantages of cloud computing outweigh the potential risks of trusting sensitive information to remote servers. With the current demands on banks' IT infrastructure and front-, middle-, and back-office staff, and the implementation of Basel IV postponed to January 1, 2023, this year may be a good time to transition ever-growing IT infrastructure to the cloud.

Cloud computing is becoming increasingly attractive to, and indispensable for, financial organisations. The cloud has the potential to completely change the financial services landscape. Banks can take advantage of cloud technologies to improve their entire risk management systems and to access fast, high-end technologies on an as-needed basis. As a result of switching to cloud computing, many services can be delivered with reduced up-front capital outlay and IT expenses.

The current state of cloud computing allows financial organisations to access any modern core banking system offering without any loss in cost-effectiveness. This not only enables banks to save costs, but also increases data processing speed and improves the quality of the financial services they provide.

Despite possible initial hurdles in implementing cloud technologies, such as security risks, reliability issues, and problems with business continuity planning, the extra flexibility and scalability provided by the cloud far outweigh the negative aspects. If an organisation can ensure effective corporate governance and security by performing rigorous endpoint management and IT policy management, the cloud will provide many security benefits.

Some IT professionals still overlook the fact that data can be more secure in the cloud than in a physical data center. They continue to see data stored in the cloud as a vulnerable asset, raising security, privacy, and compliance concerns.

It is true that some engineers are so focused on getting to the cloud that they do not initially put the time into setting up security, governance, and auditing. In the best-case scenario, the organisation only has a permissions nightmare to deal with, even though incorporating proper governance will still be a painful and expensive process. In the worst case, neglecting security in a rush to the cloud can result in a data breach or the deletion of all IaC (Infrastructure as Code, used to automate cloud resource deployments) and backups.

The cloud is very different from a traditional data center, and banks need to approach their data management differently as a result. Otherwise, the cloud could end up being an extra expensive data center should financial firms choose to throw their legacy technology into it.

Cloud computing has the resources to ensure high levels of security and prevent data breaches, but it is imperative that an organisation implement rigorous endpoint management and IT policy management to gain the maximum benefit.

Unlike traditional data centers, which typically rely on physical defenses to prevent unauthorized access to data, public clouds, such as Amazon Web Services or Microsoft's Azure, allow server-side 256-bit encryption to protect files. These files remain encrypted when they are transferred within the network or saved to cloud storage.

Data objects sent to the cloud server by the client/user are also deduplicated and compressed. If a third party were to gain access to the data, they would have to decrypt the objects without the AES (Advanced Encryption Standard) 256-bit encryption key, and then decompress and reassemble them into readable files.
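
As a rough illustration of the compress-then-encrypt idea described here (not any particular provider's server-side pipeline), the following Python sketch protects a data object with zlib compression and AES-256-GCM using the third-party cryptography package:

```python
# Illustrative only: compress-then-encrypt a data object with AES-256,
# mirroring the kind of protection described above. Not any provider's
# actual pipeline. Requires the third-party "cryptography" package.
import os
import zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def protect_object(plaintext: bytes, key: bytes) -> bytes:
    """Compress and encrypt a data object with AES-256-GCM."""
    assert len(key) == 32          # 256-bit key
    compressed = zlib.compress(plaintext)
    nonce = os.urandom(12)         # unique nonce per object
    ciphertext = AESGCM(key).encrypt(nonce, compressed, None)
    return nonce + ciphertext      # store the nonce alongside the ciphertext

def recover_object(blob: bytes, key: bytes) -> bytes:
    """Decrypt and decompress a protected object."""
    nonce, ciphertext = blob[:12], blob[12:]
    compressed = AESGCM(key).decrypt(nonce, ciphertext, None)
    return zlib.decompress(compressed)

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)
    blob = protect_object(b"quarterly risk report", key)
    assert recover_object(blob, key) == b"quarterly risk report"
```

Without the 256-bit key, an interceptor holds only compressed ciphertext, which is exactly the property described above.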

When high-performance access to a file is required, the cloud infrastructure can be modified accordingly by deploying virtual or physical cache servers. As with traditional file servers and NAS (Network-Attached Storage) devices, these servers cache only the active files needed for local, high-speed access, thus reducing storage needs and costs.

Cloud storage data and metadata are encrypted and unavailable in their at-rest format, so a cache server is required to access them. This server, in turn, provides its own additional security, such as closed unused protocol ports, no open back-end access, additional encryption between the client and the directory server, and self-encrypting drives.

The same reliable authentication procedures and access tools as in an on-premises data center can be used for cloud deployments. For instance, access to remote data can be provided through standard file sharing protocols such as SMB (Server Message Block) 1, 2, and 3 or NFS (Network File System) v3 and v4, in exactly the same way as if traditional file servers or NAS (Network-Attached Storage) devices were used.

Additionally, AD (Active Directory) permissions, which are controlled by the bank's system administrator, manage data access. An authenticated user can access only the data that is visible to them, and the rest of the data is protected through group- or user-specific policies. Moreover, the support of Active Directory trust relationships allows the creation of logical links and the application of policies between users and domains within the system.
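
To make the group-based access model concrete, here is a toy Python sketch (the group, user, and share names are invented for the example) showing how an authenticated user sees only the shares their AD groups are granted:

```python
# A toy illustration of the group-based access model described above: an
# authenticated user sees only the shares their AD groups are granted.
# Group, user, and share names are made up for the example.
GROUP_MEMBERSHIP = {
    "alice": {"Risk-Analysts", "All-Staff"},
    "bob": {"All-Staff"},
}

SHARE_PERMISSIONS = {
    "risk-models": {"Risk-Analysts"},        # restricted share
    "company-policies": {"All-Staff"},       # visible to everyone
}

def visible_shares(user: str) -> list[str]:
    groups = GROUP_MEMBERSHIP.get(user, set())
    return [share for share, allowed in SHARE_PERMISSIONS.items() if groups & allowed]

print(visible_shares("alice"))   # ['risk-models', 'company-policies']
print(visible_shares("bob"))     # ['company-policies']
```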

The cloud easily surpasses the capabilities of traditional data storage when it comes to the protection of data against accidental or intentional mistakes and system failures which would otherwise lead to data corruption.

Writing data to cloud storage is done using a WORM (Write Once Read Many) model, in which new data is always appended (added to the existing one) and never replaced or overwritten. The system creates snapshots of data at assigned intervals in order to be able to instantly recover any set of data in case any server-side or related problems occur.
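
The append-only model is easier to picture with a small sketch. The following Python class is purely illustrative (not any vendor's implementation): writes never overwrite earlier versions, and snapshots allow the store to be rolled back instantly.

```python
# Minimal in-memory sketch of the append-only (WORM-style) model described
# above: writes never overwrite existing data, and periodic snapshots allow
# an earlier state to be restored. Purely illustrative, not a vendor API.
import copy
import time

class AppendOnlyStore:
    def __init__(self):
        self._objects = {}     # key -> list of versions (never overwritten)
        self._snapshots = []   # (timestamp, deep copy of current state)

    def write(self, key: str, data: bytes) -> int:
        """Append a new version; earlier versions remain untouched."""
        versions = self._objects.setdefault(key, [])
        versions.append(data)
        return len(versions) - 1          # version number just written

    def read(self, key: str, version: int = -1) -> bytes:
        return self._objects[key][version]

    def snapshot(self) -> float:
        """Record a point-in-time copy that can be restored later."""
        ts = time.time()
        self._snapshots.append((ts, copy.deepcopy(self._objects)))
        return ts

    def restore(self, ts: float) -> None:
        """Roll the store back to the most recent snapshot at or before ts."""
        eligible = [s for s in self._snapshots if s[0] <= ts]
        if eligible:
            self._objects = copy.deepcopy(eligible[-1][1])

store = AppendOnlyStore()
store.write("ledger.csv", b"v1")
ts = store.snapshot()
store.write("ledger.csv", b"v2")        # appended; v1 is still readable
store.restore(ts)                        # instant recovery to the snapshot
assert store.read("ledger.csv") == b"v1"
```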

Third-party regulations and certifications help ensure data is secure. All public clouds, such as AWS, Azure, or GCP, are required to go through extensive third-party certifications, e.g., HIPAA, HITECH, SOC 2, PCI, and ITAR, to ensure all data is properly protected.

Consequently, they meet important audit and compliance requirements. Should a financial institution transfer its data to the cloud, it will meet all these requirements automatically.

In the past, many data and file security solutions (such as firewalls and antivirus software) only supported traditional NAS (Network-Attached Storage) software to detect and stop cyber threats. Today, the same integration capabilities are available when using cloud-based file storage.

Cloud solutions now allow high levels of flexibility when it comes to integration. This provides banks with the ability to find and isolate sensitive data, visualise data access, adopt and manage a least privilege access model, and streamline compliance activities.

Moreover, it allows unstructured data to be securely stored by financial institutions in public or on-premises cloud storage, where the cache server, as an extra layer of protection, processes the actively used data whenever high-performance access is required.

Working with an on-premises deployment creates a false sense of security because of the perception that the network itself is protected by a physical boundary. However, only the most sensitive networks operate in an air-gapped mode without any outside access. Of course, providing remote access opens systems up to certain cybersecurity risks, but in the cloud there is also less risk of misconfiguration, and all those risks are more easily mitigated by using standard security infrastructure and features, and standard security audit tools.

While cybersecurity risks exist in both on-premises and cloud environments, cloud systems are better protected than on-premises or data center deployments. It is notable many of the recent major hacks occurred in on-premises networks or hybrid environments rather than in purely cloud-based systems.

An optimally running cloud solution reduces cybersecurity risks through the use of a standard set of cloud services and technologies, which present less penetration risk than non-standard on-premises or hybrid networks.

Banking risk management functions will receive tangible benefits from cloud computing, but leaders of banks' risk departments still face significant challenges when migrating to the cloud. With the increased number of cloud adoptions in finance, the importance of day-one security, governance, and auditing should not be downplayed by a financial organisation's management. Failing to take these factors seriously will undoubtedly lead to the disruption of business operations and could damage the organisation's reputation owing to financial and legal issues.

To prevent disasters and secure a bank's data in the cloud more effectively, banks should set up multiple layers of security. For large banks and other financial organisations, it is better to set up risk management functions with a private cloud provider. Small- and medium-sized businesses, on the other hand, would benefit from taking advantage of public cloud service providers in order to grow their business and connect data securely. For highly secure operations, it is better to use a private cloud. If you use a public cloud for the upper layer of your organisation's operations, a hybrid cloud solution might also be a good option.

Moreover, hosting a cloud storage system in your own data center within a security perimeter can be just as efficient for your organisation. Private cloud solutions deployed in a private data center possess all the benefits of public clouds, including 256-bit encryption, compression, deduplication, and modular building blocks that can scale at a comparatively low cost.

By partnering with CompatibL, financial institutions can ensure they are always in control of their sensitive corporate and private information, and are compliant with the current and upcoming regulatory capital requirements.

Link:
Will the security benefits of cloud computing outweigh its risks in 2022? - Bobsguide

Read More..

Google Cloud takes a gap year. It may come back with very different ideas – The Register

Opinion Taking a look at the latest financial results from Google/Alphabet made some of us do a double-take ... and not because of the $40bn+ in ad revenue.

If you read closely, you'll see that Google Cloud has lessened its habitual loss by extending the operational lifespan of its cloud servers by a year, and stretching out some of its other infrastructure for longer.

So what, you might say, wearily playing along in the office with hardware that gets refreshed less often than an octogenarian teetotaller. But this is Google Cloud, one of the headline players in the most important enterprise IT market of our time.

If it's saying that it's improving its competitive offer by not bothering to upgrade its core CPU farm, that says a lot about the cloud, the processor market, and the future of both.

You can see the cloud as it is sold to you, easing the capex/opex ratio, adding flexibility, dialled-in scale, and performance while reducing managerial overhead. In a different light, it's also a fantastic experiment in abstracting what IT actually means in business: paying other people to worry about all the boring stuff on your behalf.

Security, energy, hardware tending, meeting demand at a global scale or just giving you an instant few cores of server to run up an idea or proof of concept without you having to buy so much as a multiway plug.

So when Google says in effect it doesn't care about upgrading CPUs this time around, you can believe it. Issues like the chip shortage and global economic uncertainty will factor into the decision, but reports from the front line of the server industry indicate that if you've got the clout, you get your share. Google is not a bit player on the market; it could push ahead with its upgrade cycle with some adjustments if it wanted to, and say as much, but no. It is opting out.

This is even more significant because Google is one of the most processor-focused providers. It reveals the processors it uses for different classes of task, sometimes even letting you pick the ones you want, and sometimes they'll even be in the region and the available configuration that you fancy.

Compare that to the choices offered by Amazon AWS EC2, which are number of cores per instance and whether you want multithreading. That's it, and that's much more typical of cloud service providers (CSPs). For most workloads, these firms don't compete on CPU. Storage tiers get the works with latency versus capacity versus cost, but compute performance? Acceptable is good enough. You will get virtual machines running on virtual CPUs, and you will like it.

This leaves the chip companies with some hard questions. They really can't shake the "performance" metric as the drug of choice, and it's still an easy sell to investors.

Headline numbers look good, HPC is always a happy place to be, and you can find plenty of other places where you need lots of performance grunt. General-purpose CPUs have to face off against GPUs and other hardware-optimised silicon there, although massively parallel tasks mostly don't care about x86's legacy.

And, as Apple has proven with its M1 architecture, x86 legacy doesn't have to count for that much elsewhere these days. It's not that CSPs and data centres are gagging for M1s, which work so well because they are so highly evolved for Apple's market.

The x86 emulation overhead is perfectly bearable there while the ecosystem catches up with native versions; acceptable is good enough, and the path forward is clear.

But CSPs aren't gagging for the latest x86 magic either; they'll happily take it at the right price and at the right time, but they'll leave it for a while too. That's a gap, which is suddenly much more interesting. MacBook owners like battery life, but CSPs really don't like the new era of accelerating energy costs.

The ARM-ification of servers at scale has been predicted a few times now, although it's never been quite clear how you get there from here. The M1, however, is a great proof of concept: and the energy bills, specifically, are a great motivator to pay attention.

It is easy now to imagine what the M1's cloud-component cousin would look like. It could be a system-on-chip with a set of computing cores that are intrinsically efficient and can be even more so with the right workload, very tightly coupled to IO and integrated memory, but instead of being tuned for an Apple machine, the SoC would work very well for a particularly configured VM and work acceptably for others. There would be nothing here that would be beyond the talents of a competent design team, no matter where they work.

With a sea of these, a CSP could have a new, performant, and very competitively priced tier that rewards workloads that are optimised for the native, highly efficient modes, but one that would remain competitive for the older tasks that would otherwise be happy running on the older hardware already in the racks.

The CSPs would get enough wriggle room to price-nudge the clientele into the low-energy workload domain while still picking up a bit more margin.

The world has already moved into the sort of containerised, multi-platform, open-dev, automated regime with the necessary tools and techniques for making apps for such an architecture. That means not much novel engineering would be needed at the codeface.

The motivation is there, the methods are at hand, and the barriers to transition are much reduced. Maybe Google's gap year is an indication that business will not resume as usual.

Read more:
Google Cloud takes a gap year. It may come back with very different ideas - The Register

Read More..

Software supply chain: the problem with outsourcing everything – Evening Standard

In the last decade, the global tech landscape has developed at lightning speed, and many of us are struggling to keep up. Most of us have got to grips with the basics (Facebook, Slack, Zoom), but terms like "the cloud" are harder to understand.

Far from a niche area of tech, the cloud is becoming a crucial tool for businesses, often with unwanted implications.

Every day, organisations looking to modernise their data practice are moving to cloud hosting and Software-as-a-Service (SaaS) based products. For those not well-versed: traditionally, a website would be hosted on a single server, usually in a data centre. Cloud hosting, on the other hand, sees a company's data distributed across different servers, usually in different places, which are all connected to form a network. This network is called the cloud.

Many companies migrate to these systems as they're easy to manage and can integrate with other complementary products across an organisation, whilst also receiving software updates in real time.

Across any given industry there are a small set of market leaders who provide the software, and the largest organisations in the world will naturally prefer to use these top solutions. They are easy to trust, have strong industry presence, and justification for procuring them is smooth for internal budget holders and external shareholders.

What we don't always think about are the macro-implications: we now have a large number of organisations that are heavily reliant on a single organisation to provide a common product.

Take a look at any accounting software used across major companies. If it's managed by a cloud-based third party on behalf of the company, then a single attack on that software provider would not only impact the accounting system of the individual company, but also others who use that same product.

Global industries putting all their software eggs in one vendor's basket is a very attractive proposition for an attacker: a single attack can scale across many different companies.

In the case of ransomware, attackers have asked themselves the following economic question: why spend time managing and attacking 100 firms individually, receiving only a small amount of capital from each of them, when I can look to the biggest business-to-business software providers in an industry and target those, charging a huge amount to restore their systems so they can continue to deliver services to their clients?


Their answer has been clear: we've started to see a global increase in software supply chain attacks. The impact on business, from downtime and lost revenue, is in the billions of dollars.

But what can organisations do? Look towards the space agencies, for starters.

Redundancy and failover (the ability to switch automatically and seamlessly to a reliable backup system) have always been key concepts in space missions, ensuring that everything is built to a high standard, but it's assumed that what can go wrong, will go wrong.

What does that mean for the rest of us that spend most of our time below the outer atmosphere? First, organisations need to have catalogued all their software products, mapping these to their dependent business functions. We need to know which are mission critical for us to continue trading, as these create the greatest business risk. These should be prioritised, back-up providers evaluated, and a redundancy plan put in place to ensure any impact is minimal in case of a failure.

This needs to be built from the ground up. When organisations run their procurement process for these mission critical systems, a back-up provider should be identified and a failover deal should be negotiated in case the primary supplier goes down.

Lastly, but something which is often overlooked, back-ups need to be stored in a format usable by a different product, instead of being tied to a single product.
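
A minimal sketch of that cataloguing exercise might look like the following Python snippet; the field names and example products are invented for illustration, not a prescribed schema:

```python
# A minimal sketch of the cataloguing exercise described above. Field names
# and example entries are illustrative, not a prescribed schema.
from dataclasses import dataclass

@dataclass
class SoftwareProduct:
    name: str
    business_function: str
    mission_critical: bool
    backup_provider: str      # negotiated failover supplier, if any
    backup_format: str        # portable format usable outside the product

catalogue = [
    SoftwareProduct("CloudLedger", "accounting", True, "", ""),
    SoftwareProduct("PayrollSuite", "payroll", True, "AltPayroll", "CSV export"),
    SoftwareProduct("ChatTool", "internal comms", False, "", ""),
]

# Mission-critical products without a failover arrangement carry the greatest
# supply chain risk and should be prioritised.
gaps = [p.name for p in catalogue if p.mission_critical and not p.backup_provider]
print("Products needing a failover plan:", gaps)   # ['CloudLedger']
```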

Attackers gravitate towards the greatest reward for their input, and supply chain attacks on software providers are an attractive way to scale the impact of their work. Fortunately, organisations can put in place sufficient failovers to mitigate many of the risks they face when their supply chain becomes the target.

Read more:
Software supply chain: the problem with outsourcing everything - Evening Standard

Read More..

Mimecast : How Secure Is the Cloud with Cloud Security Tools? – marketscreener.com

Organizations everywhere are turning to cloud computing to reduce costs and improve mobility, flexibility and collaboration. Despite rapid adoption, however, 96% of cybersecurity professionals say they are at least moderately concerned about the security of cloud computing, according to a report from ISC2.[1]

How secure is cloud computing? And what can organizations do to fortify it? Answering these questions begins with understanding common cloud computing vulnerabilities and the cloud security policies, processes and tools to reduce them.

Cloud computing enables the delivery of computing services on demand over the internet. For businesses, these services can range from databases and storage to customer intelligence, data analytics, human resources platforms and enterprise resource planning. Cloud computing is attractive to many organizations because it can provide significant cost savings - organizations typically subscribe to and pay only for the cloud services they use, which can save them time and money otherwise spent on infrastructure and IT management.

The other benefit of cloud computing is enhanced security. In most cases, the cloud is more secure than on-premises data centers. When a company operates and manages its own on-premises data center, it's responsible for procuring the expertise and resources to appropriately secure its data from end to end. Cloud-based providers, however, offer a higher level of security than many businesses can match or could afford, particularly for growing organizations or ones with limited financial resources.

While organizations can benefit from improved security by migrating to the cloud, that doesn't mean they're free from threats. Importantly, cloud security is a shared responsibility between cloud service providers and their customers. Discussed below are some of the top risks that a cloud environment poses and what organizations can do to protect against these vulnerabilities:

Misconfiguration Creates Most Cloud Vulnerabilities

While cloud service providers often offer tools to help manage cloud configuration, the misconfiguration of cloud resources remains the most prevalent cloud vulnerability, which can be exploited to access cloud data and services, says the U.S. National Security Agency.[2] Misconfiguration can impact organizations in many ways, making them more susceptible to threats like denial of service attacks and account compromise.
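
As one concrete example of what a misconfiguration check can look like, the hedged Python sketch below uses boto3 to flag S3 buckets whose ACLs grant access to all users; the bucket names are placeholders, and real cloud security posture tools go much further than this:

```python
# A minimal illustration of checking for one common misconfiguration: an S3
# bucket ACL that grants access to everyone. Bucket names are placeholders;
# requires the third-party "boto3" package and configured AWS credentials.
import boto3

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def bucket_is_public(bucket: str) -> bool:
    s3 = boto3.client("s3")
    acl = s3.get_bucket_acl(Bucket=bucket)
    for grant in acl["Grants"]:
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GROUPS:
            return True
    return False

for name in ["example-reports-bucket", "example-logs-bucket"]:   # placeholders
    print(name, "PUBLIC" if bucket_is_public(name) else "private")
```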

Poor Access Control Gives Attackers Privileges

Poor access control results when cloud resources use weak authentication methods or include vulnerabilities that bypass authentication methods. This can allow attackers to elevate privileges and compromise cloud resources.

Employees Pose Risks

Companies that have difficulty tracking how employees are using cloud computing services risk becoming vulnerable to both external attacks and insider security threats. End users can access an organization's internal data without much trouble, so they can steal valuable information or be exploited by attackers to do similar harm.

Insecure APIs Are Becoming a Major Attack Vector

Many APIs require access to sensitive business data, and some are made public to improve adoption. APIs that are implemented without adequate authentication and authorization, however, pose risks to organizations. Insecure APIs are becoming a major attack vector for malicious actors.

Since cloud security is a shared responsibility between the cloud provider and the customer, sharing arrangements need to be well understood. While a provider would typically be responsible for safeguarding the infrastructure, and patching and configuring the physical network, for example, the customer's responsibilities could include managing users, their access privileges, and data encryption. The following cloud security tools help organizations fortify their environment:[3]

Why Cloud Security Policies Are Important

A cloud security policy is a formal guideline developed to ensure safe and secure operations in the cloud. Without one, a company risks security breaches, financial and data loss, and other costly consequences including fines for regulatory noncompliance.

A cloud security policy should include:

Cloud computing can provide important opportunities and cost savings for organizations. While security remains a prevalent concern, understanding the most common threats and putting in place the proper policies, processes and tools can help companies protect themselves and their data.

[1] "2021 Cloud Security Report," ISC2

[2] "Mitigating Cloud Vulnerabilities," National Security Agency

[3] "What Is Cloud Security?", IBM

Continued here:
Mimecast : How Secure Is the Cloud with Cloud Security Tools? - marketscreener.com

Read More..

ClearOne : AV Practitioner’s Guide to the Cloud – Part 3 of 3 – marketscreener.com

Part 3: What AV Practitioners Need to Know When Incorporating Cloud into AV/IT Infrastructure

In Part I and Part II of the Guide to the Cloud series, we discussed the benefits of AV integrators using the cloud - from enabling remote management of AV/IT infrastructures to saving companies money and, in turn, creating revenue. However, a cloud implementation can take on many different roles, and the degree of manual assistance required depends entirely on the level of usage or assigned role, the site setup required, the software and installation assistance needed, and the degree of customization desired.

For example, AV practitioners that want the cloud software to simply take on a read-only supervision role just need computer operation information, an email address, and details on website browser usage. On the other hand, an AV Management/Help-Desk Role for the cloud software would only require some level of understanding around AV device setup, network connectivity, and subnets.

That said, when using cloud-based software for AV/IT administration, materials, cooperation, and help are required from site IT staff.

The requirements from the IT staff vary based on the actions one wishes to manage via the cloud management software. IT support on the implementation of cloud management software can include:

However, ClearOne's CONVERGENCE Cloud AV Manager has features that can bypass excessive IT oversight and can help enable a more straightforward setup. ClearOne CONVERGENCE Cloud AV Manager has a simplified Local Agent setup, so no detailed IT skills are needed beyond onsite computer/server setup and software installation. Device-to-Cloud registration happens through the Local Agent server, making it easy to register all ClearOne Pro Audio devices.

In addition, CONVERGENCE Cloud AV Manager uses its own server for email notifications by default. However, if desired, an organization may use its own email server, in which case its settings would be required from the IT staff.

When it comes to security, Local Agents should be operated behind a network firewall. However, they require specific ports to be open on the Local Agent server's firewall to operate fully. Through CONVERGENCE Cloud AV Manager, this task is now automated and somewhat customizable during installation.

Also, a strong understanding of network technologies such as FTP, TCP, UDP, and DHCP is typically required to set up and maintain a Cloud system. However, CONVERGENCE Cloud AV Manager does not require this understanding. Therefore, while some knowledge of these technologies can be helpful to better understand some security, network, AV device, and discovery options, it is not necessary when using CONVERGENCE software to integrate the cloud into AV infrastructure.

ClearOne's CONVERGENCE Cloud AV Manager is built for the integrator and supports a smooth and largely autonomous implementation.

The ClearOne CONVERGENCE Cloud AV Manager remote management software solution will be available for free until 2023. If you're interested in learning more about how this software can benefit your clients, click here.

See original here:
ClearOne : AV Practitioner's Guide to the Cloud - Part 3 of 3 - marketscreener.com

Read More..

3 ETFs to invest in cloud computing – Marketscreener.com

If you want to know more about the cloud computing market and its major players, I suggest you take a look at Tommy Douziech's excellent article where he details how the cloud works, the market and the future prospects of this technology, all accompanied by a complete thematic list of cloud players.

In short, cloud computing is a tool that allows you to benefit from an IT infrastructure, a development platform or even a ready-to-use service, all online, without the need for hardware at home or in your company. It runs on servers managed by the provider, who takes care of the security and maintenance of the hardware.

The cloud allows companies and users to operate infrastructure and services online, without additional hardware costs, through a simple annual or monthly subscription. It is like renting a PC that is not in your office but on a server whose location is kept secret.

First Trust Cloud Computing ETF (SKYY): This ETF is provided by the Illinois-based investment management company First Trust. With more than $5bn in total net assets, SKYY is the largest cloud fund. It employs a modified equal-weighted index limited to 80 companies, aiming to optimize the performance of the ISE CTA Cloud Computing Index. Companies listed in the index must meet these three criteria:

Since its inception in May 2011, the NAV average annualized total return is +17.39% (S&P500 average annualized total return over the period is +15.16%) and +10.55% over one year. The expense ratio is 0.60%.

Here are the top holdings as of 02/07/2022:

Global X Cloud Computing ETF (CLOU): CLOU is the second largest cloud computing ETF, with $1.2bn in assets under management. Provided by the well-known New York-based ETF producer Global X, it is based on the Indxx Global Cloud Computing Index, which offers exposure to exchange-listed companies in developed and emerging markets that are positioned to benefit from the increased adoption of cloud computing technology. Since its launch in April 2019, the NAV average annualized total return is +23.97%, and -3.26% over one year. The ETF is composed of 35 companies and the total expense ratio is 0.68%.

Here are the top 10 holdings:

Visit link:
3 ETFs to invest in cloud computing - Marketscreener.com

Read More..

Oracle Cloud Had A Banner 2021 And Now Very Credible – Forbes

If you haven't noticed, I have been positive on Oracle's Cloud Infrastructure (OCI) and have written several articles on the topic. That was not always the case. I was very critical of Oracle Cloud V1.0.

Oracle's Generation 2 Cloud is an entirely new infrastructure developed from the ground up with no resemblance to its predecessor. The design goals were better performance, pricing, and, above all else, security. Oracle Cloud V2 is a significant improvement, more competitive, and the reason I have had a change of heart.

A year ago, in a January 2021 article, I wrote, "In my view, Oracle has come a long way since Cloud V1.0, and the product is better than the market perception right now. You heard it here first. Over the next 12 to 18 months, the story will keep getting told, and you will see more key customers choosing Oracle."

A year on, it is high time to take stock of that prediction and see where OCI stands in the marketplace.


OCI by the numbers

Oracle reported stellar Q2 earnings due to continued and increased global demand for the cloud business.

The increase in OCI consumption revenue was 86% in constant currency, with total cloud customer revenue up 45%. Those are impressive numbers compared to the largest IaaS and PaaS companies. I can only imagine the disruptive transition to a consumption-based business. Customers only pay for cloud services when consumed. A simple pay-as-you-go concept, but it changes the entire sales cycle.

And, customer momentum appears to be broad-based. Enterprise and cloud native customers are running mission-critical workloads on OCI. Oracle's sample of those customers is Deutsche Bank, Pernod Ricard, Bayer, Aviva, Santander Bank, Telefonica Brasil, Quest Diagnostics, Bechtel, Carrefour, Liberty Global, Ingersoll Rand, and the National Stock Exchange of India.

Oracle is also making strides with cloud-native companies like Kaltura and SoundHound, and independent software vendors such as Telestream, Thomson Reuters, Crunch Mediaworks, Ericom Software, Fastly, and SCC Soft Computer.

Oracle also continues to expand its sports partnerships with teams like the Golden State Warriors and Red Bull Racing to showcase how those organizations are using Oracle Cloud to gain a competitive advantage on and off the court and track. It's why Oracle partners not only with individual teams but, in certain cases, entire leagues, such as the Premier League and SailGP.

The industry is taking notice too. Oracle's score in a recent Gartner Scorecard report jumped to 78%, ahead of Google.

Why the continued momentum? Below I dig into the reasons why I think OCI is gaining momentum.

Equivalent performance and latency in the cloud

At the very least, you expect comparable I/O performance and latency. OCI features isolated network virtualization, essentially moving network and I/O virtualization out of the server stack and onto the network. The result is dedicated hosts with no hypervisor overhead, interference from other servers, or shared resources, with a complete software-defined Layer 3 network topology. Off-box network virtualization enables bare metal, VMs, containers, and databases to run on the same set of APIs with the cloud-native security and governance of a Layer 3 virtual network.

OCI addresses latency concerns with a flat, non-blocking network based on a Clos network topology (named after Charles Clos). The flat topology reduces the number of routers and switches that data has to pass through, giving predictable data rates.

OCI meets enterprise requirements for database clustering (including Oracle Real Application Clusters (RAC) and Exadata), VMware, compute clustering (RDMA), and tenant isolation.

Easing migration of enterprise workloads

Migrating enterprise workloads to the cloud is non-trivial. Oracle has the Cloud Lift program, which offers technical expertise and white-glove services to help customers move to the cloud at no additional cost. More than 1,000 global companies have already used the program.

Financial incentives are possible with Oracle Support Rewards, which enables new OCI customers to reduce software license costs, even down to zero.

Oracle Support Rewards is a program where you earn $0.25 to $0.33 in rewards for every $1 you spend on OCI. Rewards are applied to reduce your technical software license support bill. The $0.33 bonus is applicable if you are an Unlimited License Agreement (ULA) customer.
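
As a quick worked example of that calculation (the annual spend figure is hypothetical; the rates are the ones quoted above):

```python
# Worked example of the reward maths described above. The rates are the ones
# quoted in the article; the spend figure is illustrative.
def support_rewards(oci_spend: float, ula_customer: bool = False) -> float:
    """Rewards earned toward the software licence support bill."""
    rate = 0.33 if ula_customer else 0.25
    return oci_spend * rate

annual_oci_spend = 1_000_000  # hypothetical annual OCI spend in dollars
print(support_rewards(annual_oci_spend))                      # 250000.0
print(support_rewards(annual_oci_spend, ula_customer=True))   # 330000.0
```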

If you use Oracle's on-premises software, you become eligible for Oracle Support Rewards when you place a Universal Credit order and begin consuming OCI.

Redefines the Telco cloud market

Oracle continues to introduce new solutions and recently announced Oracle Cloud for Telcos, a comprehensive set of cloud solutions built on OCI that provides telcos with an exceptional customer experience and helps drive new opportunities for growth. The solution takes advantage of Oracle's decades of experience working with global telcos to support their critical data systems, applications, and network operations. Oracle has already started with key partners like TIM (Telecom Italia), Bharti Airtel, and Telefonica Espana.

Strong support for regions and hybrid clouds

I believe the customer experience is better when the cloud resources are closer. It is that simple. Additionally, business continuity and compliance requirements dictate that applications run across geographically separated locations, often without having sensitive data leave the country. Oracle has invested in significant global cloud region expansion, across 30 commercial and seven government cloud regions in 14 countries. Oracle opened seven new cloud regions in the last three months alone: Israel, Marseille, Abu Dhabi, Milan, Stockholm, Singapore, and Johannesburg.

Customers can run the complete portfolio of public cloud services and Oracle Fusion SaaS applications on-premises with Dedicated Region and Exadata Cloud@Customer. At the edge, Roving Edge Infrastructure with ruggedized devices enables cloud computing at the edge of networks and in disconnected locations.

Notably, Oracle also sees strong hybrid adoption from customers like Deutsche Bank, Volkswagen, and Marsh McLennan.

The Oracle and Microsoft Azure interconnect delivers mostly seamless and private interoperability for the many customers running multi-cloud environments. Oracle now has 10 interconnects around the world with the recent addition of South Korea and Phoenix.

Oracle joined forces with Cloudflare in the Bandwidth Alliance to eliminate unnecessary data transfer fees and ease the path to multi-cloud.

Continued innovation on the platform

OCI introduced hundreds of services and features over the past year. Here are just a few that caught my attention.

Oracle Cloud for Telcos is a comprehensive set of cloud solutions built on OCI. The OCI platform runs Oracle Fusion Cloud Applications Suite, Oracle Communications core network and B/OSS solutions, and more than 60 other industry application suites, as well as third-party and custom applications and workloads. It enables telcos to build new applications or modernize existing workloads with 80-plus cloud services, including data management, developer services, analytics, and artificial intelligence.

Oracle Cloud for Telcos enables any telecommunications provider to become more agile, reduce capital expenditures and operating costs, and establish a flexible foundation for innovation. Telcos can:

Applying artificial intelligence (AI) to applications without requiring data science expertise is the goal of a new set of AI services for OCI. Prebuilt text recognition and anomaly detection models are now available to all OCI customers, once again at attractive prices. High query performance at scale, including provisioning, data loading, query execution, and failure handling, is now possible with a new MySQL Database service with an in-memory query acceleration engine called HeatWave. For the first time, MySQL customers have a unified platform for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) to run mixed workloads or real-time analytics.

Oracle introduced a new service called Oracle Autonomous JSON Database that automates database provisioning, securing, scaling, and tuning to reduce the risk and cost of human error. Oracle Autonomous Database scored the highest in all four use cases in the 2020 Gartner Critical Capabilities for Cloud Database Management Systems for Operational Use Cases.

I would be remiss not to mention security innovations in the public cloud. Oracle's philosophy has always been to build security into the core product. Security features such as encryption are enabled by default. There is a list of security features too numerous to cover here that come free with the product.

Oracle also provides additional capabilities like Security Zones, so administrators can automatically set up and enforce security policies across cloud compartments within OCI.

Finally, in August 2021, Oracle also released the Oracle Cloud Infrastructure (OCI) Cloud Adoption Framework to better support customers in their cloud adoption journey. The framework was created as a resource-rich center serving anyone who wants to further their cloud knowledge, from a tech operator to a CIO, and as an opportunity to learn more about OCI's unique capabilities. Recently, Oracle enhanced delivery of the framework by creating the OCI Cloud Adoption Framework technical site, which hosts relevant technical content, assets, and tools to better enable organizations shifting to the cloud. The site includes templates for cloud business strategy documents, reference architectures for specific scenarios, and landing zone scripts to accelerate cloud deployments, and it showcases the latest OCI Governance Model.

Wrapping up

I should probably quit my predictions while I am ahead, but clearly, Oracle is making many of the right moves in the cloud space, and customers are voting with dollars.

One area I will be closely watching for OCI in 2022 is its integration of Arm-based instances to lower its customers' cost of compute. Cloud companies like AWS have fully embraced the technology, and Oracle has to watch this carefully. I will also be looking at composable memory architectures in a future version of OCI, where customers can add more memory on demand without having to add a compute instance.

It will be interesting to check back in at the end of 2022 to review the OCI scorecard.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.


Visit link:
Oracle Cloud Had A Banner 2021 And Now Very Credible - Forbes

Read More..

Cybercriminals Target Linux-based Systems With Ransomware and Cryptojacking Attacks – Yahoo Finance

VMware report finds more than half of Cobalt Strike users are using the tool illicitly

PALO ALTO, Calif., February 09, 2022--(BUSINESS WIRE)--As the most common cloud operating system, Linux is a core part of digital infrastructure and is quickly becoming an attacker's ticket into a multi-cloud environment. Current malware countermeasures are mostly focused on addressing Windows-based threats, leaving many public and private cloud deployments vulnerable to attacks that target Linux-based workloads.

Today, VMware, Inc. (NYSE: VMW) released a threat report titled "Exposing Malware in Linux-based Multi-Cloud Environments."(1) Key findings that detail how cybercriminals are using malware to target Linux-based operating systems include:

Ransomware is evolving to target Linux host images used to spin up workloads in virtualized environments;

89 percent of cryptojacking attacks use XMRig-related libraries; and

More than half of Cobalt Strike users may be cybercriminals, or at least using Cobalt Strike illicitly.

"Cybercriminals are dramatically expanding their scope and adding malware that targets Linux-based operating systems to their attack toolkit in order to maximize their impact with as little effort as possible," said Giovanni Vigna, senior director of threat intelligence at VMware. "Rather than infecting an endpoint and then navigating to a higher value target, cybercriminals have discovered that compromising a single server can deliver the massive payoff and access theyre looking for. Attackers view both public and private clouds as high-value targets due to the access they provide to critical infrastructure services and confidential data. Unfortunately, current malware countermeasures are mostly focused on addressing Windows-based threats, leaving many public and private cloud deployments vulnerable to attacks on Linux-based operating systems."

As malware targeting Linux-based operating systems increases in both volume and complexity amid a rapidly changing threat landscape, organizations must place a greater priority on threat detection. In this report, the VMware Threat Analysis Unit (TAU) analyzed the threats to Linux-based operating systems in multi-cloud environments: ransomware, cryptominers, and remote access tools.


Ransomware Targets the Cloud to Inflict Maximum Damage

As one of the leading breach causes for organizations, a successful ransomware attack on a cloud environment can have devastating consequences.(2) Ransomware attacks against cloud deployments are targeted, and are often combined with data exfiltration, implementing a double-extortion scheme that improves the odds of success. A new development shows that ransomware is evolving to target Linux host images used to spin up workloads in virtualized environments. Attackers are now looking for the most valuable assets in cloud environments to inflict the maximum amount of damage to the target. Examples include the Defray777 ransomware family, which encrypted host images on ESXi servers, and the DarkSide ransomware family, which crippled Colonial Pipeline's networks and caused a nationwide gasoline shortage in the U.S.

Cryptojacking Attacks Use XMRig to Mine Monero

Cybercriminals looking for an instant monetary reward often target cryptocurrencies using one of two approaches. Cybercriminals either include wallet-stealing functionality in malware or they monetize stolen CPU cycles to successfully mine cryptocurrencies in an attack called cryptojacking. Most cryptojacking attacks focus on mining the Monero currency (or XMR) and VMware TAU discovered that 89 percent of cryptominers used XMRig-related libraries. For this reason, when XMRig-specific libraries and modules in Linux binaries are identified, it is likely evidence of malicious cryptomining behavior. VMware TAU also observed that defense evasion is the most commonly used technique by cryptominers. Unfortunately, because cryptojacking attacks do not completely disrupt the operations of cloud environments like ransomware, they are much more difficult to detect.
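
As a rough illustration of that detection idea, the Python sketch below flags Linux binaries that embed XMRig-related strings; the indicator list is illustrative, and real detection relies on far more robust static and behavioural analysis:

```python
# A rough illustration of the detection idea described above: flag binaries
# that embed XMRig-related strings. Real detection relies on much more robust
# static and behavioural analysis; the indicator list here is illustrative.
import sys

XMRIG_INDICATORS = [b"xmrig", b"randomx", b"stratum+tcp", b"monero"]

def looks_like_xmrig(path: str) -> bool:
    with open(path, "rb") as f:
        data = f.read().lower()
    return any(indicator in data for indicator in XMRIG_INDICATORS)

if __name__ == "__main__":
    for binary in sys.argv[1:]:
        verdict = "possible cryptominer" if looks_like_xmrig(binary) else "no XMRig indicators"
        print(f"{binary}: {verdict}")
```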

Cobalt Strike Is Attackers' Remote Access Tool of Choice

In order to gain control and persist within an environment, attackers look to install an implant on a compromised system that gives them partial control of the machine. Malware, webshells, and Remote Access Tools (RATs) can all be implants used by attackers in a compromised system to allow for remote access. One of the primary implants used by attackers is Cobalt Strike, a commercial penetration testing and red team tool, along with its recent Linux-based variant, Vermilion Strike. Since Cobalt Strike is such a ubiquitous threat on Windows, the expansion out to the Linux-based operating system demonstrates the desire of threat actors to use readily available tools that target as many platforms as possible.

VMware TAU discovered more than 14,000 active Cobalt Strike Team Servers on the Internet between February 2020 and November 2021. The total percentage of cracked and leaked Cobalt Strike customer IDs is 56 percent, meaning that more than half of Cobalt Strike users may be cybercriminals, or at least using Cobalt Strike illicitly. The fact that RATs like Cobalt Strike and Vermilion Strike have become a commodity tool for cybercriminals poses a significant threat to enterprises.

"Since we conducted our analysis, even more ransomware families were observed gravitating to malware targeting Linux-based systems, with the potential for additional attacks that could leverage the Log4j vulnerabilities," said Brian Baskin, manager of threat research at VMware. "The findings in this report can be used to better understand the nature of this malware and mitigate the growing threat that ransomware, cryptomining, and RATs have on multi-cloud environments. As attacks targeting the cloud continue to evolve, organizations should adopt a Zero Trust approach to embed security throughout their infrastructure and systematically address the threat vectors that make up their attack surface."

Download the full report here.

Methodology

The VMware Threat Analysis Unit (TAU) helps protect customers from cyberattacks through innovation and world-class research. TAU is composed of malware analysts, reverse engineers, threat hunters, data scientists, and intelligence analysts at VMware. To understand how to detect and prevent attacks that bypass traditional, file-centric, prevention strategies, TAU focuses on techniques that were once the domain of advanced hackers and are now moving downstream into the commodity attack market. The team leverages real-time big data, event streaming processing, static, dynamic and behavioral analytics, and machine learning.

TAU applied a composition of static and dynamic techniques to characterize various families of malware observed on Linux-based systems based on a curated dataset of metadata associated with Linux binaries. All the samples in this dataset are public and therefore they can be easily accessed using VirusTotal or various websites of major Linux distributions. TAU collected more than 11,000 benign samples from several Linux distributions, namely, Ubuntu, Debian, Mint, Fedora, CentOS, and Kali. TAU then collected a dataset of samples for two classes of threats, namely ransomware and cryptominers. Finally, TAU collected a dataset of malicious ELF binaries from VirusTotal that were used as a test malicious dataset. TAU started collecting the dataset in June 2021 and concluded in November 2021.

About VMware

VMware is a leading provider of multi-cloud services for all apps, enabling digital innovation with enterprise control. As a trusted foundation to accelerate innovation, VMware software gives businesses the flexibility and choice they need to build the future. Headquartered in Palo Alto, California, VMware is committed to building a better future through the company's 2030 Agenda. For more information, please visit http://www.vmware.com/company.

Sources & Citations

Exposing Malware in Linux-Based Multi-Cloud Environments, VMware, February 2022

Global Security Insights Report, VMware, June 2021

View source version on businesswire.com: https://www.businesswire.com/news/home/20220209005064/en/

Contacts

Kerry Tuttle, VMware Global Communications, ktuttle@vmware.com, (470) 247-1987

Go here to see the original:
Cybercriminals Target Linux-based Systems With Ransomware and Cryptojacking Attacks - Yahoo Finance

Read More..

Azure offers free inward migration with Data Dynamics and Komprise Blocks and Files – Blocks and Files

Microsoft's Azure public cloud is providing free inward data migration courtesy of deals with Data Dynamics and Komprise.

An Azure Storage blog by Karl Rautenstrauch, Microsoft principal program manager for Storage Partners, says these deals with unstructured data management partners "help you migrate your file data to Azure Storage at no cost!" It's recommended for use by customers with 50TB or more of data to migrate. Users with less data can use tools such as AzCopy, rsync, or Azure Storage Explorer.

He adds: "We intend this new program to help our customers and partners migrate from on-premises and non-Azure deployments of Windows or Linux file servers, Network Attached Storage, and S3-compliant object stores to Azure Blob Storage, Azure Files, or Azure NetApp Files."

This program is a complement to the Azure Migrate portfolio which many Azure customers have used to automate and orchestrate the migration of servers, desktops, databases, web applications, and more to Azure.

Customers who take up the program will be given an onboarding session to learn how to use the software and will receive access to the support knowledgebase and email support for the chosen ISV, plus up to two support phone calls. "We have also co-authored Getting Started Guides and our ISVs have created How-To videos to help you quickly begin your migration," writes Rautenstrauch. The program does not include professional services to help you configure the software beyond the onboarding session, Getting Started Guides, and How-To videos.

Data Dynamics uses its StorageX product while Komprise supplies its Elastic Data Migration (EDM) offering. EDM was launched in March 2020 and takes NFS/SMB/CIFS file data and moves it across a network to a target NAS system, or via S3 to object storage systems or the public cloud.
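
This is not how EDM works internally, but as a minimal sketch of the end result it describes (copying files from a locally mounted NFS/SMB share into S3-compatible object storage), something like the following boto3 snippet applies; the share path, bucket, and endpoint are placeholders:

```python
# Not how EDM works internally; just a minimal sketch of the end result it
# describes: copying files from a locally mounted NFS/SMB share into
# S3-compatible object storage. The share path, bucket, and endpoint are
# placeholders. Requires the third-party "boto3" package and credentials
# configured in the environment.
import os
import boto3

SHARE_ROOT = "/mnt/nas-share"            # mounted NFS/SMB share (placeholder)
BUCKET = "archive-bucket"                # placeholder bucket name
s3 = boto3.client("s3", endpoint_url="https://objects.example.com")  # any S3-compatible target

def migrate_share(root: str, bucket: str) -> None:
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            key = os.path.relpath(local_path, root)   # preserve the directory layout
            s3.upload_file(local_path, bucket, key)
            print(f"copied {local_path} -> s3://{bucket}/{key}")

if __name__ == "__main__":
    migrate_share(SHARE_ROOT, BUCKET)
```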

Komprise says EDM eliminates cost and complexity in managing file data by providing analytics-driven data migration to Azure:

Customers can upgrade to the full product, Komprise Intelligent Data Management, which means they can transparently tier across Azure Storage platforms, cutting up to 70 per cent of cloud costs.

Find out more about the Azure and Komprise migration deal here. Let's see if Amazon Web Services and the Google Cloud Platform follow in Azure's footsteps.

Go here to read the rest:
Azure offers free inward migration with Data Dynamics and Komprise Blocks and Files - Blocks and Files

Read More..

Future of Cloud Computing for Real Estate – The Times of India Blog

The real estate sector has transformed over the past decade, with a younger crowd looking to the internet to make the first step towards buying a home or property. In order to stay competent and relevant, it is important for the real estate sector to adopt new technologies. Cloud technology is the right path for those looking to build an online presence with the help of marketing. Thanks to the pandemic, as per the latest research, a massive 80% of top real estate companies have switched their operations to the cloud in one form or another. It could be lead management, customer relationship management, and more.

Here's how cloud computing can help the real estate sector:

1. Cost-effective

The real estate sector is depending on IT more and more each day. But having in-house infrastructure can be highly expensive, since it involves hiring resources and maintaining server and infrastructure costs. Migrating to the cloud can keep these costs to a minimum: the infrastructure belongs to the service provider, who offers a pay-as-you-go model.

2. Remote Accessibility

The cloud enables files to be accessible from anywhere and on any device, as long as you have a working internet connection. This certainly helps in the real estate sector when you are carrying out a client visit and want to showcase your recent work or upcoming projects as a reference. All the necessary information will be at your fingertips. The cloud also enables staff to be more productive by overcoming the disruptions that arise when physical meetings are not possible during the pandemic. In such situations, cloud accessibility makes it faster to communicate with clients in real time.

3. Increased Security

IT security is a major concern for most businesses. Implementing robust security can be both expensive and challenging if you have limited budgets. Opting for the cloud can help resolve this, since cloud service providers utilize the most advanced tools to protect data and deploy next-gen firewalls with intrusion and malware protection to keep cybercriminals away from servers and data.

4. Data Insights

Data is an integral part of decision making, providing insights that can help improve productivity, cost-effectiveness, marketing, web design, and more. Cloud technology helps analyze and centrally manage data, providing the insights needed to drive the business forward.

It helps map customer journeys across all touchpoints, ensuring that communications are consistent. The data can also help pinpoint poorly performing pages so that quick fixes can be made to keep customers on the page for longer, which in turn helps gain an advantage over competitors.

5. Superior Customer Experience

As per the latest research, 86% of customers are willing to pay more if they are offered a better customer experience. Migrating to the cloud provides an added advantage in delivering excellent customer experience, not only swiftly but cost-effectively too. Using the cloud helps businesses provide information to both buyers and sellers without them having to visit the premises physically. To take cloud usage to the next level, some real estate agents even offer financial calculation tools, virtual tours, online form fills, customer service chatbots, omnichannel messaging, and so on. This helps buyers and sellers make an informed decision and, in the long run, fosters strong relationships that lead to better reviews and online reputations.

6. Timely Upgrades

With cloud computing, the real estate sector does not have to worry about software upgrades.

The latest updates are generally taken care of by cloud service providers. This leads to savings on the investment that would otherwise go towards maintaining the servers and infrastructure that come with on-premises deployments.

7. Data Recovery

The risk of data loss is very minimal with the cloud, as cloud service providers store all the data in the cloud. In case of emergencies, there is no threat of data loss. SMEs generally cannot afford to make huge investments in expensive disaster recovery systems. Cloud computing has definitely helped these businesses thrive by providing cloud-based storage solutions that help safeguard their data.

8. Better Control of Data

Before the advent of the cloud, employees had to send files back and forth before zeroing in on the final document. This led to errors and reduced accuracy, and with multiple versions of the file being tossed back and forth, it was confusing. With cloud computing, however, multiple users can access the same file from different locations and suggest changes in real time, leading to a reduced error rate.

9. Enhanced Collaboration

Improving collaboration across teams and taking your business onto international platforms, overcoming geographical barriers and time zones, can be easily handled through cloud computing. Teams from different parts of the world can access the same document from anywhere and collaborate to perform better and in a more time-efficient manner.

10. Disaster Management and Mitigation

The use of cloud-based technologies in disaster mitigation and management is expected to only increase. The losses incurred by the real estate sector because of natural disasters or other calamities have been an area of concern for quite some time. With cloud-based technologies, disaster management can be handled better and more efficiently, not only saving lives but also minimizing the cost of repair and maintenance.

Given the rapid rate at which cloud technology is growing, it has become imperative that all sectors adopt the latest technology to meet the heightened expectations of consumers. The real estate sector will need more agile environments to function better. To stay competitive and relevant, it is now more essential than ever to move to faster technology like the cloud to achieve business continuity as well as meet customer expectations. Compared to 2020, the use of cloud computing increased by 41% in the real estate industry. Job opportunities are also booming because the integration of cloud technology is making way for new roles like Virtual Valuation Advisor and Virtual Leasing Manager, which are of great help during the pandemic. The cloud is definitely the way forward for any sector, especially real estate.

Views expressed above are the author's own.

END OF ARTICLE

See the rest here:
Future of Cloud Computing for Real Estate - The Times of India Blog

Read More..