Category Archives: Cloud Servers

Medellín Campus writes the future of worldwide industrial automation – Intelligent CIO ME

Rockwell Automation has launched an ambitious plan to support automation in manufacturing plants, regardless of their level of digital maturity. We look at the benefits it brings.

Industries are not stopping in the race for automation, and they need strategic partners with enough experience to guide them on that path. At the Rockwell Automation campus in Medellín, Colombia, teams work on various fronts related to the future of industrial automation and the software that supports it.

César Arango, Engineering Manager, Rockwell Automation, said: "When I talk about the future, I refer to the Software-as-a-Service (SaaS) proposal in the cloud, that is, the possibility of hosting all of an industry's software in the cloud and not on local servers."

In addition to having the necessary programs already deployed, and benefiting from the associated cost savings, workers can access various aspects of the operating process without being in the plant.

SaaS offers advantages such as multi-location access, immediate connection and collaboration between multiple sites. The suite that provides these services is FactoryTalk Hub, focused on software that supports an ecosystem of industrial applications. It is in line with what companies like Microsoft, Accenture, Salesforce and PTC, among others, have done.

88% of executives expect to optimize technology through services in the cloud, not by using their own servers. "Services is where we start to play," said Arango.

Rockwell Automation intends that, over the next five years, IT and OT (Operational Technology) converge to take advantage of the computing capacity the cloud offers and thus give better tools to the OT world.

This convergence will open many possibilities such as providing access to the software from anywhere, sharing information internally and with clients, enabling on-demand scaling, and as a result, generating cost reduction and performance improvement.

The products they have been developing allow the client to view the status of a plant in real-time or have analysis based on Big Data to identify deviations of any of the devices.

Arango points out that if a conveyor belt in a factory stops, the entire plant must stop. This may happen because no one noticed that the vibration level had been changing slightly for a few days; it is something that could have been foreseen. This solution aims to optimize costs and generate fewer failures in plants.
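The kind of early-warning check described above can be sketched very simply: compare a device's recent average vibration against its own baseline and flag slow drift. This is an illustrative sketch, not Rockwell's actual analytics; the function name, window and threshold are assumptions.

```python
def drifting(readings, baseline_days=7, drift_pct=0.10):
    """Flag a device whose recent average vibration has drifted more
    than drift_pct above its baseline average."""
    baseline = sum(readings[:baseline_days]) / baseline_days
    recent = sum(readings[-baseline_days:]) / baseline_days
    return recent > baseline * (1 + drift_pct)

stable = [1.00] * 14                                 # steady belt
creeping = [1.00 + 0.02 * day for day in range(14)]  # slow upward drift
```

On these illustrative series, the steady belt passes while the slowly creeping one is flagged days before any visible failure, which is the whole point of the Big Data analysis the article describes.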

FactoryTalk in three stages

FactoryTalk Hub has three main pillars: design, operation and maintenance. For its development, Rockwell Automation has formed a strategic alliance with Microsoft. In the Design Hub, the tools let you migrate from previous controller versions to the most up-to-date ones and develop emulation and design capabilities through the cloud.

For example, if you are looking to design a cookie production plant, you can simulate the entire process in the cloud through software before creating it in the physical world.

The Medellín team contributes to the Design Hub with file-type conversion tools, Logix controller updates, project analysis and digital engineering, emulation tools, digital twins and many other emerging technologies.

In the Operation Hub sits Plex Systems (recently acquired by Rockwell Automation), a leader in cloud-native smart manufacturing solutions, operating in 2,400 plants in 37 countries and processing around 8.5 billion transactions per day. Through software of the same name, it allows production to be managed and automated. Connecting on-premises or edge devices to the cloud requires IoT technology.

For the Maintenance Hub, Rockwell Automation, with its subsidiary Fiix, provides a cloud-native computerized maintenance management system (CMMS) powered by AI.

Then there is FactoryTalk Vault, launched a year ago at the Automation Fair. This application drives the management and control of industrial automation programs, allowing companies to optimize their backup costs. It combines with design tools to keep control software versions in the cloud up to date.

About Rockwell Automation

Rockwell Automation is a provider of industrial automation and Digital Transformation solutions. Its segments include intelligent devices, software and control, and lifecycle services. The company operates in approximately 100 countries worldwide, including the United States, China, Canada, Italy, Mexico, the United Kingdom, Germany and Australia.


See original here:
Medellín Campus writes the future of worldwide industrial automation - Intelligent CIO ME

How Kubernetes lowers costs and automates IT department work – The Register

Advertorial One of the key factors to consider when evaluating an IT solution is how fast updates reach the market. Releasing an application is not enough. You need to work on it every day, adding new features and services while keeping it running. Yet you can't just turn off the app, update it, and turn it on again. Your online store should stay up and running while the guys wearing shabby knit sweaters are deploying your latest updates.

To make sure that the update process goes unnoticed by users, you need microservices, containers and an orchestration infrastructure. This is where Kubernetes comes in. With this solution, manual management of different versions, subversions, and parts of an application becomes unnecessary. This makes the system more powerful, reliable, stable, and expandable. Overall, it becomes easier to use.

How Kubernetes helps businesses:

Businesses profit from using Kubernetes as it helps them automate their work. Using Kubernetes significantly reduces the amount of money spent on hardware and human resources. It allows the project team to focus on their main task of website development rather than website administration. Here's how it works:

Kubernetes running on G-Core Labs virtual machines, combined with a DevOps approach, helps businesses automate routine tasks. The application launches and behaves the same way at every stage as it would on the developer's local host.

"The economic, organizational and social consequences of the pandemic will continue stimulating digital innovations and cloud services," believes Henrique Cecci, Senior Research Director at Gartner. The consulting company expects end users to spend more than $480 billion on public cloud services next year.

Public clouds simplify the work with Kubernetes significantly as they use modern infrastructure solutions such as API. This synergetic system makes it possible to distribute the workload within the cloud efficiently, thus enhancing the profit you get from your IT investments.

Here is how it works: imagine that you run a service with users in ten countries, working with two clouds: the main cloud located in America and the backup in Europe. In the past this was enough, but new legal requirements introduced in one of your markets now oblige you to store user data on the territory of the respective country.

In this case, you will most likely address one of the cloud providers - for example, G-Core Labs. As a result, you will get access to a virtual machine featuring a powerful CDN and other resources, which will allow you to deploy and manage containers most efficiently using Kubernetes. Thanks to a content delivery network with over 140 points of presence in 100 cities around the world, your servers with all the users' personal data will be located on the territory required by law.

Usually, businesses connect with a provider because they have to, yet it brings a positive effect in the end. After migration, the service clients in the desired region download files 22.5 times faster, while storage and download fees make up about the same sum that you previously paid for storage alone.

Kubernetes allows businesses to reduce their infrastructure costs and helps companies get the most of their IT investments. Migrating to a public cloud also brings businesses further bonuses. For example, in G-Core Labs, outbound traffic and the configuration of cluster nodes through the cloud control panel or via API are free of charge. You pay only for virtual machines, disks and load balancers.

Therefore, pay due attention to the four main areas that allow you to save money: the cloud, the cluster, the main tools, and the company's culture. At the same time, you will also be able to reduce the amount of money spent on Kubernetes itself while taking full advantage of the technology.

Automatic scaling provided by Kubernetes results in high availability and maximum application performance, which are both important for businesses. Now, when you need a new container for some service, you contact the provider and connect the new server to the cluster. Kubernetes automates this process. It uses an API request to order a virtual machine from a cloud provider, connects it to the cluster, and adds the required pod (container) with the required parameters.

This platform turns out to be very helpful in many other cases as well. Let's imagine that you've launched an application in Kubernetes and that its containers are already receiving some traffic. When the CPU load increases, the platform will notice this and will automatically increase the number of machines used in order to distribute the requests properly.
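The CPU-driven scaling behavior described above is governed by Kubernetes' Horizontal Pod Autoscaler. Its core decision can be sketched as follows; this is a simplification of the real controller, which also applies tolerances and stabilization windows:

```python
import math

def desired_replicas(current_replicas, current_cpu, target_cpu):
    """Roughly the HPA formula:
    desired = ceil(currentReplicas * currentMetric / targetMetric)."""
    return math.ceil(current_replicas * current_cpu / target_cpu)

# Four pods averaging 90% CPU against a 60% target scale out to six;
# the same pods at 30% scale back in to two.
scale_out = desired_replicas(4, 90, 60)
scale_in = desired_replicas(4, 30, 60)
```

The same formula explains why the platform "notices" load changes in both directions: the replica count tracks the ratio of observed to target utilization.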

Using special metrics and tests allows the system to quickly identify damaged or unresponsive containers. Failed containers are created anew and get restarted on the same pod. This allows programmers to focus on development instead of doing routine administrative tasks.

Kubernetes allows developers to create production-like environments for automated testing. General application logs and Kubernetes app logs will help you detect problems and errors even faster.

Imagine that you've decided to completely redesign your esports video streaming app. The new layouts have already been internally tested by the team and sent to focus groups for trial. Everything seems fine. In your own cloud, everything works well. But what will it look like in production? To answer this question, you can resort to so-called canary testing, which implies a partial release of a certain service. While the overall check is still in progress, small amounts of live traffic are sent to the released application parts. The results are tracked and compared with the ideal, allowing you to make decisions about the app launch.

Such "traffic injections" remain unnoticed by the users because the containers are duplicated, and the users get redirected from one container to another. For orchestration purposes, you can use Kubernetes provided by G-Core Labs. The provider's virtual machines work with high-performance servers that have Intel Xeon Scalable processors (Ice Lake) of the 3rd generation. In April 2021, G-Core Labs became one of the world's first companies that started integrating such processors into their infrastructure.
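The "small amounts of live traffic" in a canary release are usually split deterministically on a stable user identifier, so the same user always lands on the same build. A minimal sketch of that routing (the function name and percentage are illustrative, not any particular vendor's API):

```python
import zlib

def route(user_id, canary_percent=5):
    """Deterministically send a stable slice of users to the canary
    build; everyone else keeps hitting the stable containers."""
    bucket = zlib.crc32(user_id.encode()) % 100
    return "canary" if bucket < canary_percent else "stable"
```

Because the bucket is derived from a hash rather than chosen at random per request, a user is never bounced between the old and new versions mid-session, which is what keeps the "traffic injection" invisible.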

Businesses should consider migrating to the Kubernetes platform in several cases.

Migrating to Kubernetes is necessary for companies that need to maintain their information systems online 24/7. This is exactly why using Kubernetes together with the cloud technologies offered by G-Core Labs is the ideal solution.

Sponsored by G-Core Labs

The rest is here:
How Kubernetes lowers costs and automates IT department work - The Register

3 Top Trends to Invest in for 2022 (and Beyond) – Motley Fool

The last two years haven't been easy or predictable for investors, but 2022 will present its own challenges. Uncertainty about interest rate increases and the new omicron coronavirus variant have triggered some volatility in the stock market recently, and it could carry into the new year.

But long-term strategies tend to negate short-term noise, and with the recent dip in some technology stocks, 2022 might be a great time to buy with a multi-year focus. To pick your stocks, it might be a good idea to focus on broad trends in high-growth industries.

Three Motley Fool contributors have identified Microsoft (NASDAQ:MSFT), Snowflake (NYSE:SNOW), and Upstart Holdings (NASDAQ:UPST) because, together, they operate in industries that will represent trillions of dollars of economic growth over the current decade.

Image Source: Getty Images

Anthony Di Pizio (Microsoft): Cloud computing, in the simplest of terms, is the business of accessing data and programs online over the internet rather than having them installed locally on computers or devices. In an era of remote work, and with companies operating in dozens of different countries, the cloud makes conducting everyday operations much easier because it effectively connects organizations internally -- no matter the location.

The popularity of this technology is evident in the numbers. By 2026, the cloud computing market is estimated to more than double to $947 billion in annual spend. The cloud services industry is dominated by a small handful of tech behemoths, and one of them is Microsoft. The average consumer probably associates the company with its Windows computer operating system, or its Office 365 software -- and why wouldn't they? These products serve billions of people globally.

But of Microsoft's three main business segments, cloud computing is in fact its largest, accounting for over 37% of total revenue in the recent fiscal first quarter of 2022. Cloud is also growing significantly faster than Microsoft's overall revenue.

Metric

Fiscal Q1 2021

Fiscal Q1 2022

Growth

Total revenue

$37.1 billion

$45.3 billion

22%

Cloud revenue

$12.9 billion

$16.9 billion

31%

Data source: Microsoft

This trend has been apparent for quite some time at Microsoft. From fiscal 2019 to fiscal 2021, cloud revenue grew at a compound annual rate of 24% compared to 15% for overall revenue.
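The growth percentages in the table above can be reproduced directly from the quoted revenue figures (in billions):

```python
def growth_pct(old, new):
    """Year-over-year growth, rounded to the nearest whole percent."""
    return round((new / old - 1) * 100)

# Fiscal Q1 figures, in billions, as quoted above
total_growth = growth_pct(37.1, 45.3)  # total revenue: 22
cloud_growth = growth_pct(12.9, 16.9)  # cloud revenue: 31
```

The nine-point gap between the two rates is what drives cloud's rising share of Microsoft's total revenue.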

Microsoft's cloud business is driven by its Azure platform, which provides over 200 different products and services, some of which rely on incredibly advanced technologies like artificial intelligence and machine learning. These can be used to analyze speech and images, and even make predictions using data. But Azure also caters to high-demand services like application development, security, and the Internet of Things.

I think Microsoft is one of the best stocks to buy for exposure to the cloud. It's not just because the segment is growing so quickly, but also because investors are buying a suite of other incredible businesses. Aside from the software offerings mentioned earlier, the company owns Xbox and Surface, which are multi-billion dollar hardware brands in their own right.

This diversity could make Microsoft the ultimate play in an uncertain 2022, and beyond.

Image source: Getty Images.

Jamie Louko (Snowflake): Companies have been producing an increasing amount of data over the past few years: 90% of the world's data has been created in just the past two years, and the amount of data that is being made today is expected to double in another two years. This rapid increase in data will result in companies needing more capabilities to analyze and process their growing amounts of data, and Snowflake is allowing them to do so.

The company offers businesses the ability to freely bring and store their data on Snowflake servers, and the companies only pay when they want to query and access their data. With businesses receiving increasing amounts of data every day, Snowflake is an easy choice because it doesn't charge to store the data. This has resulted in rapid adoption: Snowflake's third-quarter customer count grew 52% year over year to 5,416 customers.

This feature of Snowflake's business is what attracts customers, but the analytics is where Snowflake will thrive. With more data, companies will have to analyze their data more often, leading to increased interaction with Snowflake. The company has already seen success with this business model. The number of customers spending over $1 million with Snowflake increased 128% year over year to 148 customers in Q3. Additionally, customers who spent $100 one year ago are spending on average $173 today.
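The "$100 one year ago, $173 today" comparison is a net revenue retention figure, and the arithmetic behind it is simple (the function name here is illustrative):

```python
def net_revenue_retention(spend_then, spend_now):
    """Spend by the same customer cohort now vs. a year ago,
    expressed as a percentage."""
    return round(spend_now / spend_then * 100)

retention = net_revenue_retention(100, 173)  # 173%
```

Anything above 100% means Snowflake would grow revenue even without adding a single new customer.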

The company sees an addressable market of $90 billion ahead of it today, which is why it is investing heavily back into the business. Snowflake spent over $306 million on sales and marketing and research and development, which resulted in immense unprofitability in Q3. The company lost $155 million, representing roughly 46% of Q3 revenue. While this net loss looks bad today, as Snowflake's spending pays off through market-share gains and new products, the company will be able to scale back its expenses as growth continues.

Here's the bottom line: Snowflake's business model makes it easy for customers to join the platform, and the fast-growing data analytics market will undoubtedly grow rapidly through 2022. With these two tailwinds pushing Snowflake forward, it is well positioned to flourish in 2022 and beyond.

Image source: Getty Images.

Trevor Jennewine (Upstart): Artificial intelligence will likely be one of the most transformative technologies ever conceived of by the human race. It has the potential to improve efficiency and productivity across virtually every industry, and it will likely create tremendous wealth in the process. In fact, McKinsey & Company forecasts that AI will add $13 trillion to global economic output by 2030.

On that note, Upstart is a great example of a company using artificial intelligence to solve real-world problems. Specifically, its platform aims to make loans more accessible for consumers and less risky for lenders. Whereas traditional credit models consider between eight and 30 variables to determine who qualifies for credit, Upstart captures over 1,600 data points per borrower, and measures those data points against repayment events. That means Upstart's AI models get smarter each time someone makes or misses a payment.

More importantly, Upstart's decisioning technology considers more data, which theoretically allows it to quantify risk more precisely. In fact, internal studies have shown that Upstart's AI models can reduce a bank's default rate by 75% while holding the approval rate constant. Alternatively, its platform can boost the approval rate by 173% while holding the loss rate constant.

Not surprisingly, Upstart has seen strong demand from banking partners. Over the past year, its client base tripled, and the company made significant headway in the auto lending industry. As a result, transaction volume surged 244% to $3.1 billion in the third quarter, and revenue soared 250% to $228 million. Even more impressive, despite being a young fintech company, Upstart is profitable according to generally accepted accounting principles (GAAP).

Going forward, I think Upstart can maintain that momentum. Over the last 12 months, its technology powered $8.9 billion in loans. But management puts its total addressable market at $753 billion, and that figure could get even bigger if Upstart expands into new industries -- for instance, mortgage loan originations total $4.5 trillion each year.

More importantly, Upstart's AI models appear to give the company a significant advantage. Assuming that holds up in the years ahead, I think shareholders could see 10x returns over the next decade.

This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.

Originally posted here:
3 Top Trends to Invest in for 2022 (and Beyond) - Motley Fool

What Agencies Need to Do to Combat Shadow IT Driven by Cloud Sprawl – Nextgov

Migrating to the cloud offers federal agencies huge advantages in performance and flexibility. Government services can't effectively scale or adopt new capabilities like big data analytics, artificial intelligence, machine learning and the internet of things without migrating to the cloud. But government cloud adoption has empowered an old IT nemesis: shadow IT.

Shadow IT is the use of IT systems, devices, software, apps and services outside the supervision of an organization's approved IT systems. In the past, shadow IT was typically a business unit creating its own locally developed applications, or LDAs, because engaging the office of the chief information officer was judged too onerous. During my time in public service, I saw personnel surreptitiously use Microsoft Access to address an urgent data processing need; the tool inadvertently turned into a mission-critical system. This was only discovered when Microsoft Access reached its scaling limits, triggering an emergency project to transform it into a web-based application.

Building LDAs is even easier when using cloud services. This opportunity for shadow IT is exacerbated by government mandates to move to the cloud prior to the development of a governance structure that can monitor and manage such a move. Combine all this with the very human tendency of development teams to experiment with creating cloud resources and not clean up after themselves, and the result is more shadow IT and cloud sprawl.

Cloud sprawl is inefficient use of the cloud: over-provisioned, over-scheduled, underutilized or orphaned cloud assets. It often happens when development teams spin up new cloud resources, forget about them, then move on to the next urgent task. Even when cloud servers are terminated, the servers' storage volumes (in a sense, virtual hard drives) are often left behind. This creates orphaned cloud resources.
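Detecting the orphaned volumes described above amounts to scanning the inventory for storage that is no longer attached to any server. A minimal sketch over a mocked inventory; in practice this would query the cloud provider's API rather than a local list, and the field names here are illustrative:

```python
def orphaned_volumes(volumes):
    """Return the IDs of volumes not attached to any instance."""
    return [v["id"] for v in volumes if not v.get("attached_to")]

inventory = [
    {"id": "vol-1", "attached_to": "srv-a"},  # in use
    {"id": "vol-2", "attached_to": None},     # left behind after termination
    {"id": "vol-3"},                          # never attached
]
```

Running a sweep like this on a schedule, then tagging or deleting what it finds, is the basic mechanism behind the automated cleanup discussed later in the article.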

Teams also size cloud resources too large based upon the legacy technical specifications coming from on-prem data centers, instead of starting small and using cloud elasticity for auto-scaling. This results in over-provisioned and underutilized resources. This cloud sprawl increases costs and often leads to overruns in government program budgets.

Cloud sprawl and the related lack of governance can also make agencies more vulnerable to data breaches. When development teams create cloud resources, they may not fully understand the impact of its related configurations, as was the case in the 2019 Capital One data breach that enabled access to sensitive records stored in Amazon Web Services S3 buckets. To mitigate the risk introduced by misconfigured cloud resources, agencies need to define cloud usage standards and implement ways to monitor compliance to those standards.

Effective implementation of AIOps is the answer to modern-day shadow IT and cloud sprawl. Here's the Gartner definition: "AIOps combines big data and machine learning to automate IT operations processes, including event correlation, anomaly detection and causality determination."

One cloud-centric AIOps solution is robotic cloud automation, or RCA, a suite of AIOps capabilities that establishes governance guardrails and enforces usage standards across multiple cloud environments. For critical standards compliance issues, it can also remediate the non-compliance findings by bringing cloud resources back into the desired state configuration. This delivers significant cost savings and security improvements through automated monitoring, reporting and remediation of compliance issues.

For all enterprise cloud hosting teams, the first step to regaining control is to define your standards. When agencies are considering which standards to establish, they should embrace established industry standards. RCA is aligned with some of the most widely respected standards in the industry, including Center for Internet Security Benchmarks, NIST 800-53 and AWS Foundational Security Best Practices. These provide baseline standards to start from, including hundreds of configuration guidelines to safeguard cloud environments against todays evolving cyber threats.

As mentioned above, for many agencies the genie is already out of the bottle. Cloud adoption preceded a management structure, and teams have already created cloud sprawl and violated security best practices. In such cases, RCA deployment follows a predictable, iterative implementation pattern: first enable monitoring and reporting to understand the depth and breadth of the compliance challenges. Then agencies need an effective communication and change-management strategy that engages cloud users to adopt the new cloud standards and iteratively improve compliance.

Once fully compliant with a standard, RCA can enable automated remediation, which locks in future compliance by maintaining the desired state configuration of cloud resources in perpetuity. For example, for every new server spun up in the cloud, RCA evaluates compliance with three core configurations: proper tagging, encryption and standardized security group usage. If the server fails any of these tests, it is automatically terminated. Cloud sprawl is nipped in the bud. It's truly governance as code.
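The three-test gate described above can be sketched as a simple policy check. This is an illustrative sketch of the governance-as-code idea, not RCA's actual implementation; the required tags and approved security groups are assumed policy values:

```python
REQUIRED_TAGS = {"owner", "project", "environment"}      # illustrative policy
APPROVED_GROUPS = {"sg-standard-web", "sg-standard-db"}  # illustrative policy

def is_compliant(server):
    """Pass only if the server is fully tagged, encrypted, and uses an
    approved security group; a failing server would be terminated."""
    return (REQUIRED_TAGS <= set(server.get("tags", {}))
            and server.get("encrypted", False)
            and server.get("security_group") in APPROVED_GROUPS)

good = {"tags": {"owner": "a", "project": "b", "environment": "prod"},
        "encrypted": True, "security_group": "sg-standard-web"}
bad = {"tags": {"owner": "a"}, "encrypted": False, "security_group": "sg-custom"}
```

Because the policy lives in code and runs on every new server, non-compliant resources never accumulate in the first place.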

RCA is a powerful enforcement tool for any CIO managing a multitenant cloud environment. Yet critically, it's not enforcement in the old, top-down model of the past. RCA provides AIOps that enable teams to own more of the security responsibility because a cloud hygiene baseline is baked into the system. Agencies can save millions by embracing AIOps, shutting down existing cloud sprawl, and preventing it from happening again in the future.

Gone are the days when one central IT team could support 20, 40 or 100 separate development groups. It simply isn't possible given the complexity of cloud service offerings, even if government agencies had the budget and the talent pool to attempt it.

I do understand the lingering appeal of the "do it ourselves" approach. I remember, 10 years ago, wondering if government could truly trust the big cloud service providers to support agency infrastructure and mission. That question has been definitively answered: yes. The cloud provides incredible capabilities to agencies that we couldn't imagine a decade ago. For example, the CSPs have perfected automated database failover in their managed database products, enabling reliable and consistent failover in minutes.

Long gone are the days of engineering database synchronization and manual failovers. Now RCA enables AIOps for government to eliminate shadow IT, cloud sprawl and securely explore the potential of the cloud.

Aaron Kilinski is co-owner and chief technology officerof Simple Technology Solutions.

Read the original post:
What Agencies Need to Do to Combat Shadow IT Driven by Cloud Sprawl - Nextgov

Nvidia CEO Huang jointly files patent for software tech in the metaverse – The Register

Nvidia's CEO Jensen Huang continues inventing, as if his role in the rise of GPUs wasn't enough.

A patent application published on December 2 credits Huang as one of the inventors of a system to open and share a file in the cloud without the need for a corresponding application on local devices.

Instead, the opened file is encoded and presented through a video stream, with everything happening in the cloud. To be clear, the application is a continuation of filings and patents granted dating back to 2012 related to graphics processing in the cloud and network-attached GPUs. The new patent hasn't been granted yet.

Names of company CEOs are often attached to patents as this adds legitimacy to an invention and makes it easier to defend in court. Steve Jobs was named in over 300 patents, for example.

The patent application, titled "Method and apparatus for execution of applications in a cloud system," was filed in August this year and published this month.

The patent, if granted, could be a key cog in Huang's vision to move computing into the metaverse, specifically for engineering and scientific computing. It is like opening a CAD/CAM file via a cloud application and presenting it to users via a headset or mobile device, who can then manipulate the file within the video stream.

Headsets can already do two-way communication with cloud servers, but Nvidia is proposing a novel technique. The patent involves a cloud server receiving a file identifier from a device, pulling that file from a server, finding the relevant application in the memory of a cloud server, and then "executing the application as a video stream destined for the client device."

The patent filing seems relevant for applications in which engineers collaborate in real-time on the design of machines and equipment via the metaverse.

The patent document takes a hack at the disadvantages of desktop engineering software, which typically requires powerful computers. Many devices are also still created via a modular approach, with engineers building different blocks of an overall design and patching them together.

Nvidia has shown many videos of engineers or scientists collaborating in the cloud through video streams.

CAD/CAM software already uses GPUs for simulation and design. The metaverse may be great for simulation, but pushing engineers into a cartoon interface to collaborate on design may be a challenge. Either way, Nvidia is doing all it can to sell more chips.

The rest is here:
Nvidia CEO Huang jointly files patent for software tech in the metaverse - The Register

Truly thrifty cloud hosting – Hetzner Online GmbH

Hetzner Cloud products in Nuremberg and Falkenstein run on servers in our own Hetzner data center parks in Germany.

A video-monitored, high-security perimeter surrounding the entire data center park, together with access control systems, guarantees the highest security level.

We operate all of our data centers in accordance with strict European data protection regulations. Our data center parks in Nuremberg and Falkenstein are in the middle of Europe, giving our customers quick connections to Western, Central, Southern, and Eastern Europe.

Have a look at the Hetzner Online Data Center Park Falkenstein.

Hetzner Cloud products in Helsinki run on servers in our own Hetzner data center park in Tuusula, Finland. Tuusula is just a 30-minute drive from central Helsinki.

A video-monitored, high-security perimeter surrounding the entire data center park, together with access control systems, guarantees the highest security level.

We operate all of our data centers in accordance with strict European data protection regulations. Our data center park in Helsinki is in the north of Europe, and is a great addition to our customers' quick connections to Western, Central, Southern, and Eastern Europe.

Hetzner Cloud products in Ashburn run on our own Hetzner servers in the data center park of a third party in the USA. The location is about a 45-minute drive from the US capital Washington, D.C. and is the first Hetzner Cloud location outside of Europe.

Data security is our top priority. For this reason, we alone control the usage of our Hetzner servers in the Ashburn data center.

We host our cloud instances in our own data centers in Germany and in Finland. In Ashburn, Virginia (USA), we also provide AMD-based cloud servers and cloud features. The Ashburn location is in a region nicknamed "Data Center Alley", which is one of the most highly trafficked regions in North America, making our proximity to it ideal.

Visit link:
Truly thrifty cloud hosting - Hetzner Online GmbH

These researchers wanted to test cloud security. They were shocked by what they found – ZDNet

Insecure cloud-computing services can be a huge risk for organisations because they're a regular target for cyber criminals. Researchers have demonstrated how vulnerable or misconfigured cloud services can be after deploying hundreds of honeypots designed to look like insecure infrastructure, some of which lasted just minutes before being compromised by hackers.

Cybersecurity researchers at Palo Alto Networks set up a honeypot of 320 nodes around the world, made up of multiple misconfigured instances of common cloud services, including remote desktop protocol (RDP), secure shell (SSH), server message block (SMB) and Postgres databases.

The honeypot also included accounts configured to have default or weak passwords -- exactly the sort of things that cyber criminals are looking for when trying to breach networks.
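A toy version of such a honeypot can be sketched in a few lines of Python. Everything here -- the fake SSH banner, the in-memory log -- is invented for illustration and bears no relation to Palo Alto Networks' actual deployment; the point is simply that a honeypot is a service that does nothing but watch who knocks.

```python
import socket
import threading

def start_honeypot(host="127.0.0.1", port=0, banner=b"SSH-2.0-OpenSSH_8.0\r\n"):
    """Start a toy single-connection honeypot.

    It listens on a TCP port, sends a vulnerable-looking SSH banner to
    the first client that connects, and records the client's address.
    Returns (port, log) so the caller can watch the log fill up.
    """
    log = []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))          # port=0 lets the OS pick a free port
    srv.listen(1)

    def serve():
        conn, addr = srv.accept()
        log.append(addr)            # record who connected
        conn.sendall(banner)        # advertise the fake service
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1], log
```

A real honeypot would also capture the credentials an attacker tries and keep accepting connections; this sketch stops after the first visitor.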

SEE: Cloud security in 2021: A business guide to essential tools and best practices

And it wasn't long before cyber criminals discovered the honeypot and looked to exploit it -- some of the sites were compromised in minutes while 80% of the 320 honeypots were compromised within 24 hours. All of them had been compromised within a week.

The most attacked application was secure shell, which is a network communication protocol that enables two machines to communicate. Each SSH honeypot was compromised 26 times a day on average. The most attacked honeypot was compromised a total of 169 times in just a single day.

Meanwhile, one attacker compromised 96% of the 80 Postgres honeypots within a single 90-second period.

"The speed of vulnerability management is usually measured in days or months. The fact that attackers could find and compromise our honeypots in minutes was shocking. This research demonstrates the risk of insecurely exposed services," said Jay Chen, principal cloud security researcher at Palo Alto Networks.

Exposed or poorly configured cloud services like those deployed in the honeypot make tempting targets for cyber criminals of all kinds.

Several notorious ransomware operations are known to exploit exposed cloud services to gain initial access to the victim's network, in order to eventually encrypt as much as possible and demand a multi-million dollar ransom in exchange for the decryption key.

Meanwhile, nation state-backed hacking groups are also known to target vulnerabilities in cloud services as a stealthy means of entering networks in order to conduct espionage, steal data, or deploy malware without detection.

SEE: A winning strategy for cybersecurity (ZDNet special report)

And as the research demonstrates, it doesn't take long for cyber criminals to find exposed internet-facing systems.

"When a vulnerable service is exposed to the internet, opportunistic attackers can find and attack it in just a few minutes. As most of these internet-facing services are connected to some other cloud workloads, any breached service can potentially lead to the compromise of the entire cloud environment," said Chen.

When it comes to securing accounts used to access cloud services, organisations should avoid using default passwords and users should be provided with multi-factor authentication to create an extra barrier to prevent leaked credentials being exploited.

It's also vital for organisations to apply security patches when they're available in order to prevent cyber criminals from taking advantage of known exploits -- and it's a strategy that applies to cloud applications, too.

"The outcome [of the research] reiterates the importance of mitigating and patching security issues quickly. When a misconfigured or vulnerable service is exposed to the internet, it takes attackers just a few minutes to discover and compromise the service. There is no margin of error when it comes to the timing of security fixes," said Chen.

What Is The Cloud And Where Is It Used? – Fossbytes

Did you know that the storage space of the world's first computer was just 225 kilobytes? While we take the copious amounts of storage space on our computers and smartphones for granted these days, it is important to be curious about the Cloud, which is the future for storage devices, applications, and public and private sectors. In this article, let's look at what the Cloud is and how it's enabling us to shape a better future.

Before you take it in a literal sense: no, your Cloud data is not actually stored in masses of water droplets suspended in the atmosphere. The Cloud works because your data is stored on servers across continents, and you can access that data over the Internet. The Cloud has many advantages over a traditional data storage infrastructure; accessibility, savings, and scalability are three of its most fundamental features.

Having physical servers and maintaining them is challenging, and that's where the Cloud helps immensely. It's more cost-effective, since you won't need to buy hardware, and it's also good for the environment and for savings, because the electricity required to run your own servers is saved. Not to mention, it also saves a lot of time.

It's highly scalable: if your firm needs more storage space, you're only a few clicks away from upgrading, which beats upgrading physical servers -- a process that is time-consuming and demands a lot of capital as well as manpower. It's also accessible from anywhere.

Bonus info: imagine you've finally struck a great deal with an organization that'd require you to share some of your company's files with them, and vice versa. The Cloud makes such collaboration much easier and more efficient.

Last, and one of the most crucial things: your applications and data on the Cloud are accessible 24/7, all year round. If you have an application running on the Cloud, it can be distributed across multiple servers around the globe, so if one server goes down, another acts as a backup.
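That failover idea can be illustrated with a small Python sketch. The "replicas" below are just stand-in functions invented for the example, not real servers:

```python
def fetch_with_failover(replicas):
    """Try each replica in order and return the first successful result.

    `replicas` is an ordered list of zero-argument callables, one per
    server. If every replica fails, re-raise after the last error.
    """
    last_err = None
    for fetch in replicas:
        try:
            return fetch()
        except Exception as err:    # this replica is down; try the next
            last_err = err
    raise RuntimeError("all replicas failed") from last_err

def primary():
    # simulate an unreachable server
    raise ConnectionError("primary is down")

def backup():
    return "data from backup"
```

Here `fetch_with_failover([primary, backup])` returns the backup's data even though the primary is unreachable -- which is exactly the resilience the Cloud's geographic distribution buys you.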

Some of the famous firms that provide Cloud storage and services are Amazon (AWS), Microsoft (Azure), and Google (Google Cloud Platform). Of course, there are many other companies, but these are by far the most popular ones.

There's a lot more to the Cloud than meets the eye, but these are the things you need to know to get started. Got any questions or suggestions? Drop them in the comments section below.

JetBrains starts adding remote dev functionality on IDEs and introduces Fleet – ZDNet

Image: JetBrains

JetBrains has begun separating the front and back ends of its IDEs to allow developers to have the interface on one machine, but have the source code, toolchain, and IDE backend on another.

Using the new JetBrains Gateway IDE launcher, the connection to the remote machine is via SSH and currently only supports Linux physical and virtual machines as servers.

"The JetBrains Client runs locally and provides the user interface for the IDE backend. It's based on the IntelliJ Platform and feels just like a full IntelliJ-based IDE -- it has the same editor, code completion, navigation, inspections, and refactoring tools as a local IDE, but all of the files are hosted remotely and all of the language processing is done on the remote server," the company said in a blog post.

"Remote development is a great way to make use of powerful cloud-based servers, create reproducible, clean development environments, and avoid the nightmare of losing a laptop full of important source code."

Other restrictions on remote development include it being available only with IntelliJ IDEA Ultimate, not the free Community edition, and users' plugins needing to be installed both locally and remotely.

"We are working on the ability to install plugins remotely from JetBrains Client," the company said.

Gateway is bundled with IntelliJ IDEA Ultimate, PyCharm Professional, GoLand, PhpStorm, and RubyMine, and is able to be used standalone with CLion and WebStorm.

At the same time, JetBrains dipped its toe into the world of lightweight editors with a limited preview of Fleet, which it said has been "built from scratch with a new architecture".

Fleet opens as a text editor; once its smart mode is enabled, it connects to an IntelliJ IDEA or Language Server Protocol-based backend, depending on the language, to provide functionality such as refactoring, highlighting, completion, and type information. Fleet can also be used for collaborative development, with multiple clients able to connect to the same backend.
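The Language Server Protocol side of that handshake is simple enough to sketch: every LSP message is a JSON-RPC payload preceded by a Content-Length header and a blank line. The snippet below builds the protocol's standard `initialize` request -- a generic illustration of LSP framing, not Fleet's internal code:

```python
import json

def lsp_frame(payload: dict) -> bytes:
    """Frame a JSON-RPC message the way LSP requires:
    a Content-Length header, a blank line, then the UTF-8 JSON body."""
    body = json.dumps(payload).encode("utf-8")
    return b"Content-Length: %d\r\n\r\n" % len(body) + body

# The first message a client sends to any language server.
init_request = lsp_frame({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {"processId": None, "rootUri": None, "capabilities": {}},
})
```

Because the framing is this uniform, one editor front end can talk to servers for many languages -- the property Fleet's smart mode leans on.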

Fleet with smart mode enabled.

Fleet currently supports Java, Kotlin, Python, Go, JavaScript, Rust, TypeScript, and JSON, with PHP, C++, C#, and HTML slated to arrive soon.

Additionally, JetBrains has launched Docker development environments in its Space tool that run on JetBrains servers.

"Space lets you prepare the backend for work, cloning the Git repo, building project indexes, and resolving dependencies for you," JetBrains said.

"It will seem as if someone has come to the office an hour before you, turned your computer on, and opened the project in the IDE and prepared everything for you. So you can get your day off to a great start and begin working in a 100% ready IDE."

If an environment goes unused for 30 minutes, the container is automatically shut down, with any unsaved changes preserved. Containers are currently limited to a single repository.

Virtual machines are currently offered in 4, 8, and 16 core configurations with 8, 16, and 32GB of memory respectively, with pricing set at $0.40, $0.80, and $1.60 per hour, and storage of the environment charged at $0.008 per hour.
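As a quick sanity check on those prices, here is a small cost estimate in Python. The 160-hour usage figure and the ~730-hour month are assumptions made for the example, not anything JetBrains publishes:

```python
# Per-hour rates quoted above for 4-, 8-, and 16-core environments.
RATES_PER_HOUR = {4: 0.40, 8: 0.80, 16: 1.60}
STORAGE_PER_HOUR = 0.008

def monthly_cost(cores, active_hours, month_hours=730):
    """Compute is billed only while the environment is active;
    storage is billed for the whole month (~730 hours)."""
    return RATES_PER_HOUR[cores] * active_hours + STORAGE_PER_HOUR * month_hours

# An 8-core environment used 160 hours in a month:
# 0.80 * 160 + 0.008 * 730 comes to about $133.84.
```

The takeaway is that storage is a rounding error next to compute, which is presumably why idle containers are shut down rather than deleted.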

The developer environments can be accessed with an IDE supported by Gateway, or with Fleet.

Your iPhone's best trick is tucked away inside Photos app – do you know it?… – The Sun

APPLE has built a genius feature into your iPhone's Photos app.

You can find any photo in your camera roll in seconds, if you know how.

It's the Search function, which uses powerful computer image analysis to work out what's going on in your photos.

So you can search for specific terms like "lake", "dog" or "cheese".

And just as if you were searching on Google Images, you'll be able to find relevant photos in your own reel.

If you've had an iPhone for a while, you likely have thousands or even tens of thousands of photos.

So searching through these manually can be an absolute nightmare.

Instead, just try searching for a key term in the app to narrow it down.

Apple has built dedicated machine learning into the Photos app to recognise scenes and objects.

You can even search for specific events, like a concert that you went to.

"Photos for iOS can use the time and location of your photos along with online event listings to find matching photos," Apple explained.

"It's also possible to search by location.

So you can enter a place name like London to track photos of your trip to the UK capital.

First, make sure you're updated to the latest version of iOS.

To do this, go into Settings > General > Software Update.

Then open the Photos app -- its icon looks like flower petals in the colours of the rainbow.

Tap the big Search button in the bottom right-hand corner of the app -- the one with the magnifying glass logo.

Then at the top, use the search bar to type in something you're looking for.

You can search for a type of object or scene, a place or time, or even a person's name (if they're assigned one in your Photos app).

And don't start to panic that Apple is snooping on your photos.

All of the processing to make this work happens on your iPhone.

So your image data isn't being sent up to Apple's cloud servers to be analysed for the search function.

"When you search your photos, all of the face recognition and scene and object detection are done completely on your device," said Apple.

"Apple harnesses machine learning to enhance your experience and your privacy.

"We've used it to enable image and scene recognition in Photos, and more, without requiring your data to leave your device."

In other news, Google Chrome users have been urged to delete their browser.

Facebook recently rebranded to Meta.

Check out the best iPhone 13 deals in October 2021.

And take a look at your hidden Facebook rejection folder.

We pay for your stories! Do you have a story for The Sun Online Tech & Science team? Email us at tech@the-sun.co.uk
