
What Does the Future With Cloud Computing Look Like – Analytics Insight

Cloud computing is a significant part of digital transformation and businesses count on cloud services for agility and efficiency.

The Covid-19 pandemic transformed businesses and the economy itself. We saw a spike in digital transformation during the pandemic since it shut down offices, laid social distancing norms, introduced remote working, and demanded agility and resilience. Many industries embraced digital transformation and automation to enhance business growth to retain their space in the economy. Going digital became the new normal and cloud services had a crucial role to play in this transformation.

The cloud was not new; it had established its significance in the pre-pandemic days by offering cost-efficient services, scalability, and flexibility to businesses. Amazon Web Services (AWS) defines cloud computing as the on-demand delivery of IT resources over the internet with pay-as-you-go pricing.

Cloud services can be used for software development, customer-facing web applications, data storage and processing, and more. The beneficiaries of cloud computing range from business giants to everyday users of Google Drive or Apple iCloud.

The post-pandemic world will depend on the cloud more than ever, since the rapid digital evolution of the past year has changed working environments and lifestyles alike. According to Forrester's predictions, the global public cloud infrastructure market will grow 35% to $120 billion in 2021. Many of us may wonder whether the cloud is worth such hype, so let us look at some of its benefits and judge how important the cloud will be in the coming years.

Flexibility and low cost are the prime factors behind the cloud's continued reign in the coming years. Cloud providers offer 24/7 service with maximum scalability, and businesses can adopt cloud services on demand at any point in their journey. Cloud services give clients a degree of freedom: fluctuating bandwidth can be addressed without complicated IT updates, and IT infrastructure can be scaled up or down quickly. This flexibility will be vital in the near future, which demands agile businesses.

Cloud services can be implemented with minimal cost and maintenance. Since the pandemic left many industries short of cash, they are looking for cost-efficient ways to transform their businesses. The cloud is a pay-as-you-go service, so you do not have to worry about paying for what you don't use.
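As a rough illustration of the pay-as-you-go idea, the bill tracks hours actually consumed rather than provisioned capacity. The rate and usage figures below are entirely hypothetical, not any provider's actual pricing:

```python
# Toy pay-as-you-go bill; rate and hours are hypothetical illustrations.
RATE_PER_HOUR = 0.085   # assumed on-demand price for one small VM, in USD
hours_used = 310        # e.g. instances scaled down outside business hours

monthly_bill = hours_used * RATE_PER_HOUR
print(f"monthly bill: ${monthly_bill:.2f}")  # $26.35, not a full month of idle capacity
```

A fixed on-premise server, by contrast, costs the same whether it runs 310 hours or sits idle all month.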

Cloud services offer great accessibility, enabling people to work with the cloud from their own spaces. The remote working ecosystem is here to stay, and with cloud services there are no geographical limitations: anybody can access them from anywhere, which gives companies better connectivity among employees even during a crisis. The cloud is also lauded for its mobility, since the services can be reached from a smartphone or personal laptop. It does not tie users to a particular set of infrastructure or machines; we can practically carry it with us, and maybe that's why it is called the cloud.

The hybrid cloud is another developing area that will soon benefit industries. Hybrid cloud systems increase the flexibility of cloud services and provide better options, since services can be sourced from multiple vendors. Dependence on any single third-party service provider is reduced, as a hybrid cloud allows companies to customize their cloud setup to their specific requirements. Hybrid clouds can also reduce IT development costs over the long term.

Today, all organizations are concerned about data security, and increased digitization can also increase data vulnerability. Last year brought several reports of cyber breaches and hacking as industries struggled through the crisis. The coming years will demand the best safety practices to protect data. The cloud can be considered one of the more secure data backup options for keeping hackers at bay, and data recovery becomes easier with cloud computing in place.

Cloud computing offers a wide range of benefits to industries in the post-pandemic recovery stage, which demands agility and resilience. The integration of cloud computing with AI, big data, and IoT can be leveraged to reach new heights of innovation. Cloud services can readily take over from on-premise infrastructure and increase business efficiency. Business enterprises have already set out on the road to digital transformation, and cloud computing will underpin them on their way to success.


How cloud-based AI is set to transform mobile apps – Cloud Tech

The ongoing growth of cellular wireless network applications creates new demand for emerging technologies. Fifth-generation (5G) wireless communication includes several network layers that leverage technology such as Open Radio Access Network (Open RAN), network slicing, and cloud edge computing.

Within cloud-edges, the broader artificial intelligence (AI) industry is witnessing a migration of AI to the edge. According to the latest worldwide market study by ABI Research, the edge AI training and inference market for chipsets is expected to grow from $2.6 billion in 2020 to $10.7 billion in 2025, at a CAGR of 35 percent.
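The quoted growth rate can be sanity-checked against the two market-size figures. A quick computation, using only the numbers cited above, gives a compound annual growth rate in the same ballpark:

```python
# CAGR implied by ABI Research's figures: $2.6bn (2020) to $10.7bn (2025).
start, end, years = 2.6, 10.7, 5
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # roughly 33%, close to the ~35% quoted
```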

Furthermore, new enterprise use cases place new performance, agility, and latency requirements on the network. These, along with the ongoing quest to drive new growth, are compelling the industry to shed human-intensive networks in favor of an intelligence-driven ecosystem.

According to the ABI assessment, telecom service providers are already actively expanding the utilisation of AI and machine learning (ML) beyond merely digitising internal and external interactions.

"Many communication service providers (CSPs) are already on a journey to become augmented service providers, where AI augments human decision-making for prediction, analysis, and new revenues," said Don Alusha, senior analyst at ABI Research.

Rakuten, for example, has renamed its network operations centres (NOCs) as service experience centres (SECs) as it implements extreme automation for self-aware networks. In addition, Telefónica Tech is a new venture to incubate new growth based on AI/ML, cloud, and IoT/big data.

"AI/ML capabilities enable the industry to leverage IT-oriented nimbleness and scale as they seek to manage the complexities of today's networks and establish new commercial models," Alusha adds.

"New commercial models will need to complement existing asset-intensive environments, where an understanding of the cost of goods sold, inventory turns, managing factories, and supply chains is key to success. In the new world of cloud, AI/ML, and software, technology vendors do not manufacture a product and sell it.

"They sell a capability. They sell knowledge. They create it at the same time they deliver it. The business model is different and so are the economics. DriveNets, Enea Openwave, Ericsson, HPE, and Nokia are some vendors among many others that are building software-centric ways of marketing and selling solutions. The point is that AI/ML-based platforms are re-shaping existing commercial models. The winners will be those who act decisively and thoughtfully," Alusha says.

For CSPs, the continued maturity of AI/ML will be a key enabler of new value creation in their journey to become digital service providers. Technology is a key pillar of that journey, but there are other dimensions that, if not considered part of the overall digitalisation journey, may limit CSPs' ability to capture the full value at stake.

Specifically, change management is critical and constitutes the bulk of the effort as CSPs embrace new ways of working. Equally important is to embrace openness and break the siloes, two sides of the same coin. CSPs that are investing in AI/ML-based platforms must consider that efficiency will come from sharing knowledge and embracing open platforms where APIs and data can be easily accessed.

AI/ML, big data, and open APIs offer agility and the ability to drive innovation. Consequently, the new world in cellular must start with a foundation of software and API-led connectivity. The ability to harness the power of software-defined networking platforms and AI/ML capabilities is the future.

"This may well mean that, in addition to bolting on software and intelligent capabilities, CSPs need to learn how to build them as cloud-edges, Open RAN, and 5G core proliferate in the ecosystem," Alusha concludes.

This is an emerging market where mobile network operators can differentiate their network services with value-added offerings. I believe that open innovation methodologies will enable service providers to engage their chosen technology partners that become part of evolving and expansive 5G ecosystems.

Interested in hearing industry leaders discuss subjects like this and share their experiences and use cases? Attend the Cyber Security & Cloud Expo World Series, with upcoming events in Silicon Valley, London, and Amsterdam, to learn more.


Can Cloud Computing Lead the Way on Climate? – Data Center Frontier

Cloud computing's growing role in the energy markets positions the data center industry to drive a global shift to renewably-powered business. New on the DCF Show.

When it comes to climate change, it is time for the data center sector to lead. The massive energy footprint of cloud computing enables the data center industry to drive a global shift to renewably-powered business. This is becoming a huge issue for cloud platforms, data center developers and service providers and especially for users of data center services.

This is the second in a series of broadcasts outlining the Eight Trends That Will Shape the Data Center in 2021. The data center industry is in a unique position to accelerate the adoption of sustainable practices and reduce the damage to our changing climate. Customers and stakeholders are demanding accountability on climate impact, creating a compelling business incentive to embrace sustainability.

Listen to today's show:

Here are links to the additional resources on enterprise IT trends that I mentioned in the podcast:

These article series provide excerpts of some of the findings in our recent reports:

Be sure to check out our free Annual Forecast, outlining the DCF take on the most important trends to watch this year:

Did you like this episode? Be sure to subscribe to the Data Center Frontier show so you get future episodes on your app. We are on Apple Podcasts, Spotify, or wherever you find quality podcasts. You can also subscribe using our RSS feed, and tell your friends and colleagues about our podcast!

We've also just launched the Data Center Frontier Channel on YouTube to showcase our videos, where we explain the infrastructure that powers the digital world. If you like data centers or work in the industry, you'll want to subscribe!


Google Cloud boss is focused on growth: ‘We need scale in order to be profitable’ – CNBC

Thomas Kurian, chief executive officer of cloud services at Google LLC, speaks during the Google Cloud Next '19 event in San Francisco, California, U.S., on Tuesday, April 9, 2019. The conference brings together industry experts to discuss the future of cloud computing.

Michael Short | Bloomberg | Getty Images

Google cloud chief Thomas Kurian said Wednesday that he's focused on growth for now, with an eye toward profitability later on.

"We're working closely with Alphabet CFO Ruth Porat to improve our operating margin and operating loss, which will both benefit from increased scale over time," Kurian told Heath Terry, a Goldman Sachs managing director, at the virtual Goldman Sachs Technology and Internet Conference. "Scale will bring material improvements in profitability. And we're very focused on that. We know in this business we need scale in order to be profitable."

Investors are looking for reassurance that Google's years-long investment in cloud computing can contribute to, rather than detract from, the profitability of Google parent Alphabet. Growth could also make Google more competitive with market leader Amazon, which derives almost half of its income from its cloud division.

Kurian's comments come a week after Google parent Alphabet for the first time disclosed the operating results of the cloud unit, which includes the Google Cloud Platform public cloud infrastructure and Google Workspace productivity software subscriptions. The unit has lost at least $4 billion in each of the last three years, and finished 2020 with a loss of $5.61 billion on $13.06 billion in revenue. Revenue growth slowed slightly from 53% in 2019 to 47% in 2020.
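From the disclosed 2020 figures, the implied operating margin is easy to work out:

```python
# Google Cloud FY2020, as disclosed: $5.61bn operating loss on $13.06bn revenue.
revenue_bn, operating_loss_bn = 13.06, 5.61
margin = -operating_loss_bn / revenue_bn
print(f"operating margin: {margin:.0%}")  # about -43%
```

That margin is the gap Kurian expects scale to close over time.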

The Google Cloud organization remains focused on accelerating revenue growth, said Kurian, who replaced Diane Greene as head of Google's cloud group two years ago after spending two decades at Oracle.

Kurian said Google is keen to launch cloud infrastructure in more countries in order to satisfy organizations' regulatory needs, pointing to uptake in Indonesia and South Korea. At the same time, Google Cloud continues to invest in expanding its sales function; Kurian said the business is on track to expand its direct sales organization by more than three times.

"We really believe we have a good line of sight on how as you scale investments," he said. "That opens up markets and opens up revenue."

Nominations are open for the 2021 CNBC Disruptor 50, a list of private start-ups using breakthrough technology to become the next generation of great public companies. Submit by Friday, Feb. 12, at 3 pm EST.



Why cybersecurity isn’t cybersecure without detection and recovery – Cloud Tech

Anyone responsible for data security who doesn't get a shiver down their spine when they read about yet another high-profile ransomware attack in the news is either doing something very right, or something very wrong.

The danger of falling victim to a cybersecurity incident is growing as the volume of attacks continues to rise and bad actors become increasingly sophisticated. Interpol has highlighted how Covid-19 affected both the number and nature of cyberattacks during 2020, and notes: "Vulnerabilities related to working from home and the potential for increased financial benefit will see cybercriminals continue to ramp up their activities and develop more advanced and sophisticated modi operandi."

The natural reaction to such worrying news is to seek protection and build the walls, and there are plenty of firms out there whose livelihood depends on providing just that. The best of them do a grand job, and their regular threat reports indicate just how many attacks they defeat.

But let's not kid ourselves: no organisation can ever ensure 100% protection from an attack, especially when attack types are changing faster than most firms update their defences. Data often sits in too many locations, some forgotten by the user, and too many of these areas are likely to fall outside those covered by upfront protection, scanning services, and threat intelligence. Even some approaches to backup and restore can be haphazard, augmented over time as new systems are added, resulting in complex backup routines and outdated scripts that are no longer fit for purpose.

How many organisations can say, with absolute certainty, that there are no data silos or duplicate systems sitting outside the main protected area but with access to the network inside it? How many can give absolute assurance that all backups, live or archived, are completely clean of infection and reliable?

If 100% protection is not possible, what is an organisation to do to protect itself? We would not for a moment advocate giving up on using a protection service. As a first line of defence it is absolutely necessary, but multiple lines of defence are needed for robust and reliable security: the trickier you make it for an attacker, the less likely they are to succeed. One of the first lines of defence, aside from upfront protection and firewalls, must be threat detection. Knowing there is a problem, perhaps before it materialises into a full-blown extortion attempt, and with some hope of restoring systems and ejecting the attacker, is invaluable.

Sadly, too many organisations fail to recognise this and are punished for it. Consider the malware attack that's discovered because an unwitting employee has an issue and needs a restore, only for the IT team to find, hours or maybe even days later, depending on how the restore has been set up, that the ransomware has reinstalled itself. It had planted itself quietly and neatly in the backup, where it sat undetected, waiting for a restore to reinject it into the business.
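One minimal line of defence against exactly this scenario is scanning backup contents before any restore. The sketch below is purely illustrative: a hash blocklist stands in for real malware detection (commercial products use far richer behavioural and entropy-based checks), and the file layout and blocklist entry are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical sketch: flag any backup file whose SHA-256 appears on a
# known-bad list, so an infected backup is caught before it is restored.
KNOWN_BAD_SHA256 = {
    # sample entry: the SHA-256 of an empty file, used here only for illustration
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def suspicious_files(backup_dir):
    """Yield paths under backup_dir whose content hash is on the blocklist."""
    for path in Path(backup_dir).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_BAD_SHA256:
                yield path
```

Running such a scan as part of the restore runbook, rather than after it, is what stops the reinfection loop described above.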

None of this is idle speculation. Look at any sector and there are examples of very serious outages from the past year in the UK alone.

Early in 2020, Redcar and Cleveland Council suffered a long-lasting outage due to what it reported as a ransomware attack. The attack started on February 8, and it took a month for services to be up and running again. The council has put the cost of recovering from the incident at over £10 million. In October 2020, Hackney Council was the victim of a cyberattack, and even weeks later it had still not been able to bring all its data back online. The cost of that attack is, as we write, still unknown, as recovery is ongoing.

Of course, for nearly any recovery strategy, the data is only as current as the last backup taken. Every organisation has differing needs, but each must weigh a variety of factors to determine how frequently to back up, including the cost of downtime and the resources needed to bring the business back online. The right answer will depend on your business's size, the team you can dedicate to recovery, the nature of the business, the regulations you operate under, and of course budget and critical operations.

A bank, for example, could not only lose business, and therefore money; if the backup data used to recover is even just a few hours old, it is in trouble. A small retailer selling plants, by contrast, could get by with weekly backups. It's all relative, and the only people capable of assessing the criticality of backup and recovery for your business are you and your team. What is a niggle for one business is front-page news and a CEO firing for another.
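The trade-off just described can be sketched as a toy cost model. Every number below is hypothetical; the point is only to illustrate why the "right" backup interval differs so sharply between a bank and a small retailer:

```python
# Weekly cost = cost of running backups + expected cost of data lost in an
# incident (on average, half a backup interval's worth of data is at risk).
def weekly_cost(interval_h, cost_per_backup, lost_data_cost_per_h, incident_prob):
    backups = 168 / interval_h                      # backups per week
    expected_loss = incident_prob * (interval_h / 2) * lost_data_cost_per_h
    return backups * cost_per_backup + expected_loss

hours = range(1, 169)
# A bank-like profile (lost data is very expensive) vs a small retailer.
bank_interval = min(hours, key=lambda h: weekly_cost(h, 50, 10_000, 0.02))
shop_interval = min(hours, key=lambda h: weekly_cost(h, 50, 20, 0.02))
print(bank_interval, shop_interval)  # the bank's optimum is far more frequent
```

With these assumed figures the bank's optimal interval lands at a few hours, while the retailer's lands at the weekly end of the range, matching the intuition in the paragraph above.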

But what we can be pretty certain of is that an organisation can't just park its data in backup and hope for the best.

Through a robust, reliable backup and restore setup, with strong malware detection capabilities, organisations have a genuine chance to protect themselves, and get back up and running, malware free, in less than an hour. However, without the combination of a front line of defence protecting against cyberattacks and a reliable set of measures for recovery when the front line inevitably fails, no organisation has an appropriate level of protection and recovery. Now, as we head into the unknown of 2021, how does your business stand up to attack?



Jane Gilson, CEO, CloudSphere: On taking the top job and the huge enterprise multi-cloud opportunity – Cloud Tech

CloudSphere, a cloud governance provider based primarily in the United States and Ireland, has only been going for six months. Yet the company has a unique approach and a big mission and believes it has found the right person to lead it.

Jane Gilson, formerly of Google and Microsoft (at the latter she helped lead cloud transformation and build data centre strategy), was appointed at the end of last month. CloudSphere said Gilson's extensive international experience, her background with software-as-a-service models, and her understanding of cloud customer needs will help the company scale to the next level.

CloudSphere is a combined entity of HyperGrid, a provider of cloud management and governance, and iQuate, a company focused on agentless discovery and application mapping. The latter is key to the company's strategy: by focusing on an application-centric vision of cloud governance, its solution offers granularity in areas such as migration planning, cloud cost management, and cloud security posture management.

The company therefore sits squarely in the middle, offering monitoring, compliance, or consulting (arguably all three) for the complex hyperscaler clouds. Demand for this type of business, particularly with the rise of multi-cloud workloads, continues to rise. MarketsandMarkets pegged the CAGR of the cloud monitoring market at 19.2% between 2016 and 2022.

For Gilson, these complementary factors all helped inform her decision to join. "The enterprise adoption of multi-cloud architectures is a once-in-a-generation transformation of the IT landscape," she tells CloudTech. "CloudSphere is uniquely positioned to address the resulting cloud governance and management challenges."

Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform comprise almost two thirds of the global cloud infrastructure services market. Synergy Research pegged it at the beginning of this month at 32%, 20% and 9% respectively. CloudSphere has partnerships with all three, and Gilson is looking to bolster this going forward.
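The "almost two thirds" figure checks out against the individual shares cited:

```python
# Market shares as cited from Synergy Research, in percent.
shares = {"AWS": 32, "Azure": 20, "Google Cloud": 9}
total = sum(shares.values())
print(f"combined share: {total}%")  # 61%, just under two thirds
```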

"The hyperscalers are constantly adding new capabilities on their platforms, and we need to work closely with them to make sure we understand their plans and how we can help our customers take advantage of those new capabilities," says Gilson. "That takes a consistent dialogue and a commitment to the partnership."

"We also see the hyperscalers as a route to market," Gilson adds. "Our cloud governance platform is available on both the AWS and Azure marketplaces because our customers have made it clear that they like the simplicity and economics of buying software through their cloud provider."

Another factor behind CloudSphere's growth acceleration is, understandably, Covid-19. As companies accelerate their cloud migrations, there is a knock-on effect for other areas. According to Centrify research, as this publication reported in November, 73% of enterprises polled had accelerated their migration plans to support remote working, and 81% had sped up their IT modernisation processes.

This relates to Gilson's view of enterprise multi-cloud adoption as once-in-a-generation. "Cloud adoption has been forced through a 5-10 year acceleration in the last 12 months," she says. "As cloud adoption accelerates amid the Covid-19 pandemic, IDC forecasts worldwide spending on cloud services will surpass $1 trillion in 2024."

"Multi-cloud strategies are now the norm, and with 81% of organisations using two or more cloud providers already, securing and governing multi-cloud environments is a top IT challenge facing enterprises," Gilson adds.

"The opportunity is clear: organisations desperately need simple, comprehensive security and governance amid the complexity."

Gilson believes CloudSphere has the right approach and technological differentiation to become a worldwide leader in this space, and that approach will be defined by her leadership. What is the best piece of advice she has been given in her career, and why? "Own your seat at the table, take that job you don't feel fully ready to do, and build and nurture your network," says Gilson. "It will pay dividends."

The journey CloudSphere and Gilson will take from here is certainly worth keeping an eye on.



Learn the essentials of cloud computing with this stacked bundle – AOL


TL;DR: The 2021 Cloud Computing Architect Certification Bundle is on sale for $21.69 as of Feb. 13, saving you 98% on the list price.

If you're interested in learning the essentials of cloud computing, and you've got about $20 and 21 hours to spare, you can get started today.

With the help of this cloud computing training led by Idan Gabrieli, you'll become adept at solving latency issues and improving overall tech processes for companies. The training covers everything from cloud computing basics to microservices to machine learning, and is designed especially for beginners in the field. Gabrieli (B.Sc. and MBA) is an entrepreneur, cloud and AI expert, and presales manager with a vast technical background.

SEE ALSO: The fastest VPNs for browsing, streaming, and shopping securely

Throughout the nine courses in this collection, you'll explore the evolution of cloud technology, discover the five characteristics of cloud computing, discuss different cloud service models, and learn about cloud deployment models. From there, you can dive into either a three-part series on cloud computing with Microsoft Azure or a three-part series on machine learning. There's also a beginner's guide to microservices architecture and a deep-dive course on the IaaS model of cloud computing.

If none of this makes any sense to you yet, it's totally fine. Gabrieli does a great job of taking you through the content step by step, starting with the absolute basics and working up from there.

While this cloud computing training is valued at $1,300, you can sign up for a limited time for just $21.69.



Microsoft, HPE Bringing AI, Edge, Cloud to Earth Orbit in Preparation for Mars Missions – HPCwire

The International Space Station will soon get a delivery of powerful AI, edge and cloud computing tools from HPE and Microsoft Azure to expand technology experiments aimed at preparing NASA for launching future crewed exploratory missions to Mars.

The new equipment and software, including HPE's specialized second-generation Spaceborne Computer-2 (SBC-2), will mark the first time that broad AI and edge computing capabilities are available to researchers on the space station, Tom Keane, Microsoft's vice president of Azure Global, wrote in a Feb. 11 post on the Azure blog.

The new hardware, software and services are scheduled for launch to the ISS at 12:36 p.m. on Feb. 20 aboard Northrop Grumman's 15th (NG-15) Commercial Resupply Services cargo mission. The NG-15 mission's launch from the Wallops Flight Facility at Wallops Island, Virginia, is contracted by NASA to bring needed supplies.

The new SBC-2 computer that's heading to the space station follows the original Spaceborne Computer-1, which was sent to the ISS in 2017 as part of a validation study to test it against the rigors of space aboard the orbiting laboratory. SBC-1 returned to Earth in 2019 after completing its mission. Both Spaceborne Computer-1 and Spaceborne Computer-2 are sponsored by the ISS National Lab.

SBC-2 will bring ISS researchers a wide range of new capabilities they did not have with the original Spaceborne machine from 2017 to 2019, Dr. Mark Fernandez, solution architect for converged edge systems at HPE and principal investigator for SBC-2, told HPCwire sister site EnterpriseAI. Technological advancements in AI, cloud and more will open up new possibilities for ISS researchers on the new machine, he said.

"Hardware-wise, we're sending up the HPE Edgeline Converged EL4000 Edge system, which is purpose-engineered and built to operate on the edge and take advantage of AI and ML capabilities with its onboard Nvidia T4 GPUs," said Fernandez. "These are enterprise-class, commercial off-the-shelf servers that go into data centers."

Featuring CPUs and GPUs

The Edgeline EL4000 servers will use Nvidia T4 GPUs for AI and machine learning, image processing, video processing and other tasks. Previously, SBC-1 used CPUs for those tasks; the latest SBC-2 will include both CPUs and GPUs to allow comparative performance experiments in space.

The 1U boxes slot into standard 19-inch data center racks on the ISS, and the racks are then inserted into lockers aboard the station to hold them securely. Also provided is an enterprise-class compute node, HPE's ProLiant DL360, for intense compute requirements, said Fernandez.

For the second generation of the SBC, NASA asked HPE to send up twice the compute power of the original version, said Fernandez. "So, we're sending up twice the number of servers. You'll see two lockers, and each contains two servers."

"One is a CPU-based Intel server for those that love Intel and traditional computing, and we'll have a GPU-based Edgeline server for those that are doing image processing, artificial intelligence, machine learning, etc.," said Fernandez.

NASA asked for double the computing power in the SBC-2 so the agency can continue its work toward sending humans to Mars, he said. SBC-1 was a proof-of-concept device for 18 months and now the new SBC-2 will be tested to see how it reacts to two to three years in space to accommodate a mission to Mars, he added.

Azure in Space

The Azure cloud capabilities will be used with the machines to allow experiments with getting data back and forth from the ISS to Earth as quickly and efficiently as possible, said Fernandez. Such data transfers are done today using existing NASA technologies.

"The ISS is only 220 miles up in Earth orbit, but the networking is circa-1980," said Fernandez. "We have speeds of two megabits a second up and down to the ISS. I have 50 megabits a second in my home."
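Those link speeds translate into very different transfer times. As an illustration (the 100 MB payload size is an assumed figure, not from the article):

```python
# Time to move a 100 MB result at 2 Mbit/s (ISS) vs 50 Mbit/s (home broadband).
def transfer_seconds(megabytes, megabits_per_second):
    return megabytes * 8 / megabits_per_second

iss_s = transfer_seconds(100, 2)
home_s = transfer_seconds(100, 50)
print(f"ISS: {iss_s / 60:.1f} min, home: {home_s:.0f} s")  # ISS: 6.7 min, home: 16 s
```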

Increasing those speeds will be critical for Mars missions, he said.

"Microsoft is enabling that, and they have aspirational plans to come up with some AI and machine learning that we'll look at," said Fernandez. One idea they will explore is running data on SBC-2 and sending small amounts back to Earth, then comparing that to bursting data to Azure to see which is faster.

"We're sitting right on top of the same NASA network, but we're going to encode and compress messages back and forth in order to take the most advantage of that two megabits per second," he said. "I have a brilliant scientist who is going to run the same experiment on CPUs, on GPUs and in the cloud. And he will report back to the community: if you have this type of data, it's best if you process it this way, because we are given those three options."
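The pay-off from compressing repetitive telemetry before it crosses a slow link is easy to demonstrate. This standalone sketch uses gzip from the Python standard library on an invented message, not whatever encoding NASA or Microsoft actually employ:

```python
import gzip
import json

# Highly repetitive telemetry, as sensor streams often are (invented sample data).
telemetry = json.dumps([{"sensor": "temp1", "value": 21.5}] * 1000).encode()
packed = gzip.compress(telemetry)
ratio = len(packed) / len(telemetry)
print(f"{len(telemetry)} -> {len(packed)} bytes (ratio {ratio:.3f})")
```

On repetitive data the compressed payload is a small fraction of the original, which is exactly what makes a 2 Mbit/s link usable.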

The experiments will begin after the equipment arrives at the ISS and is installed and set up, tasks expected to take some time, including several days for the cargo mission to reach the space station. "We've got three pre-canned experiments for three different users that we're going to hope to fire off right away," said Fernandez.

How Azure Views Its Mission Aboard the ISS

"The crux of this work is about making the capabilities of Azure available to astronauts, space explorers and researchers to learn and advance science and the use of the cloud to support their goals," a Microsoft spokesperson told EnterpriseAI. "Through this project we will be able to continue to gain knowledge on how we can best support the science and research community, wherever they are, on and off the planet."

"With SBC-2, Microsoft's research and Azure space engineering teams are evaluating the potential of HPE's [space-based] state-of-the-art processing in conjunction with hyperscale Azure, alongside the development of advanced artificial intelligence (AI) and machine learning models to support new insights and research advancements," the spokesperson said.

That includes weather modeling of dust storms to enable future modeling for Mars missions, plant and hydroponics analysis to support food growth and life sciences in space, and medical imaging experiments using an ultrasound on the ISS to support astronaut healthcare. Also being created is a platform for developing and testing hybrid edge-cloud environments before contributing additional experiments to the ISS.

"We are exploring the potential of empowering exciting new experiments that leverage the far-reaching potential of the cloud in conjunction with the HPE edge capabilities," the spokesperson said. "To date, researchers have often had to limit the scope of their study to what computational resources they had available to conduct their research."

Using bursting capabilities with Azure will add to future capabilities, according to Microsoft. "Bursting down to the cloud provides access to more computation/resources than can be hosted in the ISS, while leveraging SBC-2's power and proximity at the edge," the spokesperson said. "We are excited to empower others, even in space, to be able to leverage the power of Microsoft Azure, making it possible for astronauts, space explorers, and researchers to think big as they tackle their toughest questions."
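The edge-versus-bursting comparison Fernandez describes, computing on SBC-2 and downlinking only a small result versus downlinking raw data and computing on Azure, ultimately reduces to a total-elapsed-time comparison. A minimal sketch of that trade-off, with all figures invented for illustration rather than taken from the article:

```python
def best_strategy(raw_bytes: float, result_bytes: float,
                  edge_seconds: float, cloud_seconds: float,
                  link_mbps: float = 2.0):
    """Pick edge processing vs. cloud bursting by total elapsed time.

    Edge: compute on the edge device, downlink only the (small) result.
    Cloud: downlink the raw data, then compute in the cloud.
    """
    def xfer(nbytes: float) -> float:
        # Transfer time over the downlink, in seconds
        return nbytes * 8 / (link_mbps * 1e6)

    edge_total = edge_seconds + xfer(result_bytes)
    cloud_total = xfer(raw_bytes) + cloud_seconds
    return ("edge", edge_total) if edge_total <= cloud_total else ("cloud", cloud_total)

# 500 MB of raw sensor data, a 1 MB summary, a slow edge CPU vs a fast cloud:
print(best_strategy(raw_bytes=500e6, result_bytes=1e6,
                    edge_seconds=600, cloud_seconds=30))  # ('edge', 604.0)
```

With a 2 Mbps link, the raw-data transfer dominates, so even a much slower edge processor wins; this is the intuition behind running the same experiment on the ISS CPUs, GPUs, and in the cloud and reporting which is fastest for each data type.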

See the rest here:
Microsoft, HPE Bringing AI, Edge, Cloud to Earth Orbit in Preparation for Mars Missions - HPCwire


Businesses need to think hybrid when going for cloud computing – Gulf News

Businesses are making the shift to the cloud... but is that enough? A hybrid approach is the best way to go about it.

Building a cloud journey has become a top priority for businesses to help them adopt innovative technologies, revolutionize business models, and subsequently impact the bottom-line.

However, challenges like management, security, and governance of data and workloads distributed across multiple clouds have become significant barriers to defining a robust and cohesive cloud journey. A hybrid cloud can resolve most of these challenges.

The hybrid model allows businesses to manage multiple clouds explicitly designated to meet current and incremental requirements, data, and workloads in a secured and governed manner, backed by a flexible architecture.

Such a landscape may include the combination of one or more on-premise infrastructures, internally managed or outsourced private clouds, public clouds from multiple providers, and even infrastructure for legacy and modern IoT and edge systems, all running simultaneously to fuel the digitization needs of the enterprise across functions.

In banking, for example, instantaneously processing millions of transactions and delivering a perfect customer experience throughout the journey is pivotal. However, securing the customer's data at every step is essential for banks to maintain trust and integrity.

A hybrid approach helps banks define modern architectures to cater to security, governance, speed, ease of management on-demand with agility in a business-strategy-first model.

In an IBM study conducted by IDC in the Middle East, Turkey and Africa, 85 per cent of respondents said they are pursuing or looking to pursue a hybrid cloud strategy in their organization. More than 55 per cent cited flexibility and significant cost savings as the key reasons to adopt a hybrid cloud.

Let's explore some of the other tangible benefits.

Simplify movement between clouds

Enterprise workload and capability requirements, cost implications, security or regulatory pressures, the risk exposure of an infrastructure lift-and-shift, and other factors can drive a sudden full or partial cloud-to-cloud migration. Mission-critical workloads span extensive IT estates that include traditional data centres and clouds in different locations, each with unique governance and regulatory requirements.

In parallel, the immense focus on artificial intelligence has mandated businesses to collect, aggregate, and analyze data for building actionable insights to innovate business models at scale. On average, a business today draws data from over 400 sources to build advanced analytics for decision-making.

As the volume of data grows, the maintenance implications and underlying cost grows too, and can often slow down the entire transformation.

Build once, deploy anywhere

Building a hybrid approach can offer greater flexibility to tackle these obstacles by providing an open, faster, and more secure way to move core business applications to any cloud. With hybrid solutions, organizations can easily move and deploy containerized workloads consistently on any infrastructure.

This can also reduce expected downtime from weeks to hours. Beyond migration, such solutions also help with automation capabilities, improve employee productivity, and deliver better end-to-end customer journeys while reducing the burden of governing content and processes.

At Telecom Egypt, we implemented a hybrid cloud solution, infusing AI for more flexibility and scalability of operations. They can now manage and automate their networks, while identifying, isolating and resolving problems before they impact operations. All of which is powered by real-time, historical analytics.

This will also enable new digital services with ease.

Openness and security

Not relying on a single vendor increases the flexibility of choosing the right cloud for your business requirements without fitting needs to vendors' specifications. A hybrid approach also lets you meet certain local or industry-specific regulatory requirements by separating workloads or data into locations as needed, while still running your systems in a centralized fashion.

Our work with appsNmobile Solutions in Ghana has seen the collaboration create a fast, stable infrastructure that builds in security for payment transactions in a challenging environment.

Today's organizations want choice, and they can have it by choosing the appropriate cloud for each of their requirements. They seek open innovation - minus vendor lock-in - and the ability to build once and deploy anywhere.

Choosing a hybrid cloud that provides the agility, security, mobility, integration, and cost efficiencies will play a key role in driving business success in the post-pandemic era.

- Mostafa Zafer is Vice-President at IBM Data & AI, Automation and Security, Middle East and Africa.

Follow this link:
Businesses need to think hybrid when going for cloud computing - Gulf News


Volkswagen taps Microsoft's cloud to develop self-driving software – The Indian Express

Volkswagen AG on Thursday said it will use Microsoft Corp's cloud computing services to help streamline its software development efforts for self-driving cars. Volkswagen, which owns brands such as Audi and Porsche, is working on both self-driving cars for the future and driver-assistance features, such as adaptive cruise control, in current vehicles. But the company's brands had been developing those features independently.

Last year, Volkswagen consolidated some of those development efforts into a subsidiary called Car.Software to better coordinate among the makers, with each company handling its own work on the look and feel of the software while collaborating on core safety functions such as detecting obstacles.

But the various companies inside the group were still using different systems to develop that software, and the deal announced Thursday will put them on a common cloud provider, Dirk Hilgenberg, chief executive of Car.Software, told Reuters in an interview.

The Microsoft deal will also make it much easier to deploy software updates that add new features to cars, a practice that helped set Tesla Inc apart from many rivals early on. Volkswagen in 2018 inked a deal with Microsoft to connect its cars to Microsoft's Azure cloud computing service.

The Thursday deal means that the software updates will be developed on the same cloud that will then beam those updates down to the cars. "Over-the-air updates are paramount," Hilgenberg said. "This functionality needs to be there. If you can't do it, you will lose ground."

In practical terms, the deal means that cars that initially hit the road with a few driver-assistance features today could add new capabilities over time that bring them closer to autonomous driving, said Scott Guthrie, executive vice president of cloud and artificial intelligence at Microsoft.

"For our phones 15 or 20 years ago, when you bought it, it pretty much never changed. Now, we expect every week or every couple of days that, silently, there's new features," Guthrie told Reuters in an interview. "That ability to start to program the vehicle in richer and richer ways, and in a safe way, transforms how the experience works."
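The core of any over-the-air update flow is a version check: compare what the vehicle is running against what the cloud offers, then download and apply only when newer. A minimal sketch of that idea follows; every name, version number, and feature string here is hypothetical, not Volkswagen's or Microsoft's actual API:

```python
# Minimal sketch of an over-the-air update check. All names are invented
# for illustration; real OTA systems add signing, staged rollout, and
# rollback on failure.
from dataclasses import dataclass


@dataclass
class UpdateManifest:
    version: tuple   # e.g. (2, 4, 1) for release 2.4.1
    features: list   # human-readable changelog entries


def needs_update(installed: tuple, manifest: UpdateManifest) -> bool:
    # Tuples compare element-wise, so (2, 4, 0) < (2, 4, 1).
    return installed < manifest.version


manifest = UpdateManifest(version=(2, 4, 1),
                          features=["lane-keeping refinement"])

print(needs_update((2, 4, 0), manifest))  # True: a newer build is available
print(needs_update((2, 4, 1), manifest))  # False: already current
```

Developing the updates on the same cloud that delivers them, as the article describes, shortens the loop between building a feature and flagging it as available in a manifest like this one.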

See the original post:
Volkswagen taps Microsofts cloud to develop self-driving software - The Indian Express
