
Exclusive Interview with Yancey Spruill, CEO of DigitalOcean – Analytics Insight

Cloud computing was popular and well established before the pandemic, but it took the spotlight when the move to remote working was thrust upon us, and it looks like it's here to stay. Cloud technologies are moving from a linear evolution toward an exponential one. DigitalOcean simplifies cloud computing so developers and businesses can spend more time building software that changes the world. Analytics Insight engaged in an exclusive interview with Yancey Spruill, CEO of DigitalOcean.

Founded in 2012, DigitalOcean is headquartered in New York City with data centers worldwide. The company combines the power of simplicity, community, open source, and customer support, so customers can spend less time managing their infrastructure and devote more time to building innovative applications that drive business growth.

We offer a variety of computing, storage, and networking products and solutions that range from simple website hosting and virtual private networks to scalable virtual machines and tools for gaming development, video streaming, and database management all powered by cloud technology.

When DigitalOcean launched in 2012, other cloud providers were neglecting software developers and small and medium-sized businesses (SMBs). Prices were too high, and the offerings were unnecessarily complicated. We focused on keeping our products simple, and that continues to be our differentiator today. While other cloud providers have focused their energies on large enterprises and legacy companies, we are dedicated to building better tools for developers, SMBs, and startups who want to run and scale applications in the cloud.

In addition to providing a simple yet performant cloud platform, we offer consumption-based pricing that is simple and transparent. We have also focused on building a true community of developers to support them in building the products of the future. Our community site provides comprehensive tutorials and guides for developers learning to work with Linux and those seeking to create complex setups within their infrastructure. We also provide free 24/7 support to all our customers.

The company's offerings are developed with the entrepreneurial spirit in mind, and that is what sets DigitalOcean apart. We enable entrepreneurs around the world with technology that allows them to test an idea, grow it into a fully functioning business, then scale that business. We have built a global community of developers and business owners who come together to support each other. Millions of people come to our community for advice and tutorials. When you invest in a community, you create a stickiness that can pay dividends over the long term.

Cloud computing does most of the heavy lifting behind the scenes of application development and delivery. It frees up time for developers and businesses to focus on what matters, such as developing the next generation of software that will change the world. The beauty of DigitalOcean is the simplicity with which it scales, both up and down, so that business owners and developers can use what they need, easily and affordably.

The pandemic has created a new tech ecosystem that has enabled small and medium-sized businesses (SMBs) to find new ways to adapt and innovate while accelerating a digital-first approach to engaging with customers. DigitalOcean gives developers better software and tools to launch and scale any application in the cloud. By offering the simplest platform, the most transparent pricing, and an elegant user interface, we reduce complexity so that developers can build great software, faster. This way, startups and SMBs can use our solutions to power their digital transformation projects and bring their business ideas to life. We also support CTOs by providing technical architecture reviews with our developer advocates and solution engineers.

We will continue to focus on building tools that SMBs and developers enjoy using, and on building the most active and engaged community possible. Last year, we announced the acquisition of Nimbella, a serverless platform provider. We're focused on fully integrating the offering and look forward to launching it under the DigitalOcean brand this year. We will also continue to invest in our teams worldwide, who work with our customers to understand their systems and help them with their migration to the cloud. Our goal is to empower developers and software companies around the world to build amazing things. Our robust, affordable, and simple infrastructure is making the cloud more accessible than ever. India is poised to innovate in a big way in the next decade, and we want to be there to support every business as it grows and succeeds.

These technologies measure and report data and data streams from billions of sensors and devices worldwide. By minimizing human intervention, they provide organizations with improved transparency, predictability, and faster execution, which helps reduce the cost of operations. Organizations also benefit from analyzing large volumes of data to see patterns and trends and draw informed conclusions.

The cloud market represents a huge opportunity. Globally, 14 million new companies are created every year. There are 50 million software engineers. Today, those groups spend about USD 50 billion on the cloud. That is expected to double in future years. DigitalOcean is well-positioned to make the most of this opportunity.



Analytics Insight is an influential platform dedicated to insights, trends, and opinions from the world of data-driven technologies. It monitors developments, recognition, and achievements made by Artificial Intelligence, Big Data and Analytics companies across the globe.


Amazon, Google or Microsoft? Boeing chooses all of the above for cloud computing services – GeekWire

Boeing engineers huddle over a computer to work on an aircraft design. (Boeing Photo / Bob Ferguson)

The billion-dollar competition to provide Boeing with cloud computing services is finished, and the winner is a three-way split. Amazon Web Services, Google Cloud and Microsoft are all getting a share of the business, Boeing announced today.

In a LinkedIn post, Susan Doniz, Boeing's chief information officer and senior VP for information technology and data analytics, called it a "multi-cloud partnership."

"This represents a significant investment in the digital tools that will empower Boeing's next 100 years," she wrote. "These partnerships strengthen our ability to test a system or an aircraft hundreds of times using digital twin technology before it is deployed."

Doniz said that becoming more cloud-centric will provide Boeing with global scalability and elasticity without having to predict, procure, maintain and pay for on-premises servers.

Financial details relating to the multi-cloud partnership were not disclosed.

Historically, most of Boeing's applications have been hosted and maintained on on-site servers managed by Boeing or external partners. You could argue Boeing's extensive intranet blazed a trail for today's cloud computing services.

"Marketing and performing computer services involves a whole new way of doing business," Boeing President T.A. Wilson declared in 1970 when Boeing Computer Services was formed.

In recent years, Boeing has been transitioning from its own aging computer infrastructure to cloud providers. For example, in 2016 the company chose Microsoft Azure to handle a significant proportion of its data analytics applications for commercial aviation.

At the time, that was considered a notable win for Microsoft, but Boeing also has maintained relationships with AWS, Google and other cloud providers.

Some had expected Boeing to pick a primary provider as a result of the just-concluded bidding process. Last year, The Information quoted its sources as saying that the deal could be worth at least $1 billion over the course of several years, and that Andy Jassy, who is now Amazon's CEO, saw it as a must-win for AWS.

But if Boeing is favoring one member of the cloud troika above the others, it's being careful not to tip its hand publicly: today's announcement consistently lists the three companies in alphabetical order. (If you happen to know who the big winner is, send us a tip.)

Update for 4 p.m. PT April 6: In an interview with Insider, Amazon Web Services' senior vice president of sales and marketing, Matt Garman, discussed Boeing's decision to parcel out the contracts for cloud computing services, and said just about what you'd expect an executive in his position to say.

"They're announcing that they're going to have a couple of different partnerships," Garman said. "I think the vast majority of that will land with AWS."

Cloud services aren't the only connection between Amazon and Boeing: in its news release about the Boeing cloud deal, Amazon notes that it has more than 110 Boeing aircraft in its Amazon Air delivery fleet.

The other two cloud titans are also talking: Microsoft noted that it has been working with Boeing for more than two decades and that today's deal will deepen the relationship. Meanwhile, Google emphasized its efforts to match 100% of the energy powering its cloud workloads with renewable energy, calling itself the cleanest cloud in the industry.


The key is the cloud: How to keep up with the speed of innovation – CIO

At DISH Network, cloud-adoption strategies vary with when each part of its business started, from those born in the cloud to legacy sectors deploying cloud opportunistically. But one thing is clear to Atilla Tinic, the company's EVP and CIO: "I do think the key is the cloud." He added: "The strategy around cloud is not ROI on a case-by-case basis. It's a must if a company wants to stay relevant."

Tinic is among the speakers at CIO's Future of Cloud Summit, taking place virtually April 12-13. Focusing on speed, scale and software innovation, the event will gather technology executives to discuss both strategy and concrete implementation tactics.

The program begins April 12 and will cover aspects of cloud innovation and agility, with Jessica Groopman, founding partner at Kaleido Insights, kicking off the day with a look at three macrotrends reshaping cloud and software innovation. She will also field questions in a live discussion.

Throughout the day, CIOs from leading companies will dive into aspects of their strategy. Christopher Marsh-Bourdon, head of hybrid environments at Wells Fargo Bank N.A., will offer insights on designing hybrid cloud environments for security and flexibility. Shamim Mohammad, EVP and chief information and technology officer at CarMax, will present a case study on how the cloud enables the company's signature Instant Offer feature. Addressing how to maximize the value of every dollar spent in the cloud will be Jennifer Hays, SVP of engineering efficiency and assurance at Fidelity Investments, along with FinOps Foundation Executive Director J.R. Storment.

Michael Riecica, director of security strategy and risk in Rockwell Automation's Chief Information Security Office, will drill into the security aspects of cloud strategy. And hear how the U.S. Federal Reserve System leverages cloud-smart strategies from System CIO Ghada Ijam.

James Cham, a partner at Bloomberg Beta, will offer a venture fund's perspective on changes to watch in software development, deriving value from big data, and where AI fits in. Cham will also lead a live discussion on how cloud and other technology investments can be used as catalysts for building business value.

Another opportunity for interaction with peers and experts will take place in a workshop on cloud as a business strategy platform led by Kevin L. Jackson, CEO of GlobalNet and the host of Digital Transformers.

Hear from world-class analysts such as Dion Hinchcliffe, vice president and principal analyst at Constellation Research, who will preview what cloud will look like in five years and advise CIOs on how to address the challenges of fast change while successfully dealing with talent scarcity and technical debt. IDC's Dave McCarthy, research vice president of cloud infrastructure services, will advise on how to navigate a future of digital infrastructure focused on cloud. He will cover application modernization, the best approach for multi-cloud deployments and where to invest in automation. McCarthy will follow up the presentation with a live discussion Wednesday on cloud trends.

Wednesday will focus on shifting into a cloud native future, starting with a tutorial on how to gain and maintain a competitive advantage from champion racecar driver Julia Landauer. Later, she will answer questions about habits and mindsets that drive success.

Priceline Chief Technology Officer Marty Brodbeck will share how the online travel agency sped up its cloud native software production. Meanwhile, Expedia Vice President of Development and Runtime Platform Robert Duffy will discuss how to become a results-driven cloud native organization with Cloud Native Computing Foundation Chief Technology Officer Chris Aniszczyk.

Looking to integrate cloud native apps into a seamless operational platform? Professional Case Management CIO Charlie Billings will share his organization's experience. In another session, Joseph Sieczkowski, CIO for architecture and engineering at BNY Mellon, will discuss cultivating an agile and dynamic operating model.

Finally, in a glimpse at what's to come, learn the hottest cloud native software development trends from InfoWorld senior writer Serdar Yegulalp and Group Editor for UK B2B Scott Carey. In addition, Yegulalp will present a non-technical introduction to Kubernetes and best practices for managing container-based applications at scale.

Throughout the summit, sponsors including Cloudera, Freshworks and others will share innovative solutions for building your cloud strategy.

Check out the full summit agenda here. The event is free to attend for qualified attendees. Don't miss out; register today.

Pictured above (left to right): Ghada Ijam, System CIO, Federal Reserve System; Atilla Tinic, EVP, Chief Information Officer, DISH Network; racecar driver Julia Landauer.


Will cloud computing be Canada's next big military procurement? Here's what to know – Global News

Ask most Canadians what the military needs next, and cloud computing might not be the first thing that jumps to mind.

But modernizing how Canadian security officials manage increasingly massive troves of data could be among the most important decisions of the coming years, and federal officials have confirmed to Global News that preliminary work is underway.

"Militaries are reflective of the societies they live in, and a lot of the development of how we're going to fight wars in the future is stuff that we see in society today, which is large amounts of data management," said Richard Shimooka, a senior fellow at the Macdonald-Laurier Institute.

"It's taking huge amounts of information and organizing and storing it away, and then actually applying them to conduct operations."


Canadian national security agencies and the military sit atop troves of data that must be continually tracked, assessed and managed in order to support the operations carried out to protect the country's interests.

Increasingly, though, those reams of data aren't being stored just in filing cabinets or basements or bunkers. They sit in the cloud: the digital ether that most Canadians likely know best as the safe haven for backing up old family photos or for syncing information between multiple devices.

As the amorphous nature of cyber warfare and cyber conflict have demonstrated over recent years, being able to gather, interpret, share and act on digital information is already a critical part of how militaries and national security agencies do their jobs in the 21st century.

Yet modernization has been a slow march for Canadian security actors, including the Canadian Forces.

"Some of our systems and processes are dating back to the '50s. So [there is] crazy potential to upgrade that with not even modern practices, but to catch up to the 2010s," said Dave Perry, vice-president of the Canadian Global Affairs Institute and an expert in Canadian defence policy.


"It was a massive accomplishment to start using [Microsoft] Office 365 in recent years."

U.S. military cloud contracts are worth billions

Speculation about whether Canada could look toward a cloud computing contract comes amid plans south of the border to award a multibillion-dollar contract later this year for the Department of Defense.

Last summer, the U.S. Defense Department announced plans to award a contract in April 2022 for what it now calls the Joint Warfighting Cloud Capability.

That initiative aims to bring multiple American IT providers into a contract to provide cloud computing services for the military, and it replaces a single-vendor program planned under the former Trump administration that was known as JEDI, the Joint Enterprise Defense Infrastructure project.



Last month, the Pentagon announced the JWCC contract won't be awarded until December 2022.

Microsoft and Amazon are believed to be frontrunners for different parts of that deal, while Google, Oracle and IBM have also expressed interest.

Some of those firms are now also lobbying Canadian officials to get similar contracts in place here.

Which firms are lobbying Canadian officials?

Google, IBM, Oracle and Microsoft did not have any lobbying listings with national security officials in recent months, although all list cloud computing among their broader lobbying interests with other departments, including the Treasury Board Secretariat, Justice Canada, and Natural Resources.

Amazon Web Services does have recent records on file disclosing lobbying of national security agencies and officials, one of its listed interests being seeking contracts with multiple government departments and institutions with regard to Amazon cloud-based solutions and related support services.


The web giant also has job postings up for its push to get cloud computing into Canadian government departments, including an account manager role tasked with increasing adoption of Amazon Web Services by developing strategic accounts within Canada's federal government national security sector.

According to lobbyist filings, Eric Gales, president of the Canadian branch, had meetings with Michael Power, chief of staff to Defence Minister Anita Anand, on Feb. 19, 2022, and one day earlier had met with the acting assistant deputy minister of Shared Services Canada, Scott Davis.

He also met with Sami Khoury, head of the Canadian Centre for Cyber Security, on Nov. 17, 2021.

The Canadian Centre for Cyber Security is part of the Communications Security Establishment, Canada's signals intelligence agency and the body tasked with protecting the Government of Canada's IT networks.

A spokesperson for the CSE confirmed early work on the matter is underway.


"The evolving information technology (IT) world is moving to cloud-based services. We are aware that our closest allies have, or are acquiring, classified cloud capabilities, and we continue to engage in conversations with them on security requirements to maintain interoperability," Evan Koronewski said.

"The Government of Canada's security and intelligence community is engaged in preliminary research, exploring the requirements for classified cloud services."

He added officials are exploring security requirements with the Treasury Board Secretariat, Shared Services Canada, and the Department of National Defence.

A spokesperson for the latter also confirmed that the military is working on incorporating more cloud capabilities, though not yet for classified material.

"We recognize that cloud computing offers key benefits in terms of IT efficiency," said Dan Le Bouthillier.

"DND/CAF is building its cloud capacity and has adopted a multi-cloud strategy with multiple vendors, namely Microsoft, Amazon Web Services, and Google."

He added the goal is to strike the right balance between agility and security.

The website for Shared Services Canada, which handles IT services for government departments, states there are framework agreements for cloud computing in place with eight providers, among them Google Cloud, ServiceNow, IBM Cloud, Oracle, ThinkOn, Microsoft and Amazon Web Services.

Those agreements let departments contract cloud services as needed through those providers.


The U.S. military cloud computing contract is valued at US$9 billion, or about C$11.2 billion.

It's not clear how much a similar solution for national security agencies here could cost.

Both Prime Minister Justin Trudeau and Defence Minister Anita Anand have suggested in recent weeks that the government is weighing an increase to defence spending, moving it closer to the NATO target, which aims to see all members of the military alliance spend at least two per cent of GDP on defence.

Canada's current defence spending sits at 1.39 per cent of GDP.

To hit the two per cent target would require approximately $16 billion.

That would be above the increases currently projected under the government's 2017 plan to boost defence spending, which will see it rise to $32.7 billion by 2026-27 from $18.9 billion in 2016-17.


Cloud or Mainframe? The Answer is Both – IBM Newsroom


By John Granger | Senior Vice President of IBM Consulting

April 06, 2022

To respond to the ongoing pressures of the global pandemic, businesses around the world have turbo-charged their digital transformations. Everywhere you look, companies face an acute need for speed to market, flexibility, nimbleness and, of course, ongoing innovation.

These priorities are why companies are looking to take advantage of cloud computing. But it is not straightforward; it's not just a hop to the public cloud. Clients have issues of security and data gravity, and complex systems that are expensive to migrate. Strategically, they have concerns about optionality, about lock-in, about discovering that their cloud providers have just become their competitors. These realities explain why so few clients have made a wholesale move to the cloud.

The unique needs each company faces in its business transformation journey require a diverse mix of applications and environments, including traditional data centers, edge computing and SaaS. So what is the role of the mainframe in today's IT infrastructure?

According to a recent IBM study*, the vast majority (a whopping 71%) of IT executives surveyed from major corporations across seven industries say critical mainframe-based applications not only have a place in their IT platforms today but are central to their business strategy. And in three years, the percentage of organizations leveraging mainframe assets in a hybrid cloud environment is expected to more than double. Four of five executives say their organizations need to transform rapidly to keep up with the competition, which includes modernizing mainframe-based apps and adopting a more open approach to cloud migration.

A hybrid cloud approach that includes and integrates mainframe computing can drive up to five times the value of a public cloud platform alone, with the main sources of value falling into five categories: increased business acceleration, developer productivity, infrastructure efficiency, risk and compliance management, and long-term flexibility. With the billions of dollars our clients have invested in business-critical mainframe applications like financial management, customer data and transaction processing over the years, this strategy holds true for IBM's global consulting practice. Our clients' primary goal is to modernize those existing investments and minimize risk while delivering hybrid cloud innovation.

Digital transformation is not an either-or process. We guide our clients on the application modernization journey with these key recommendations:

First, adopt an iterative approach. Many enterprises are experiencing firsthand the complexity of their IT estates. Continuing to add to the existing vertical cloud silos undercuts their flexibility by making processes related to development, operations, and security even more fragmented than before, and cloud fragmentation makes it virtually impossible to achieve the standardization and scale that cloud promises to deliver. Therefore, your plan to integrate new and existing environments must factor in your industry and workload attributes to co-create a business case and road map designed to meet your strategic goals. Adopt an incremental and adaptive approach to modernization rather than a big bang. Leverage techniques such as coexistence architecture to make the transition to an integrated hybrid architecture gradually.

Then, assess your portfolio and build your road map. To understand your desired future state, first assess your current state. Examine the capabilities that define the role of the mainframe in your enterprise today and how those capabilities tie into the greater hybrid cloud technology ecosystem. In addition, take stock of your existing talent and resources and determine any potential gaps. For IBM's consulting business, the partnership and role that IBM Systems plays is fundamental, for the simple reason that solutions such as the new IBM z16 perform many of the critical functions underpinning a truly open and secure hybrid cloud environment. These functions include accessing troves of unstructured on-premises data across a hybrid cloud platform, scaling and automating data-driven insights with AI, and processing critical apps and data in real time, all while assessing security risks. Storing data across multiple clouds and moving it between partners and third parties can leave companies more vulnerable to security issues such as data breaches. Assessing infrastructure solutions that can protect data even when it leaves your platform is crucial.

Finally, leverage multiple modernization strategies and enable easy access to existing mainframe applications and data by using APIs. This means providing a common developer experience by integrating open-source tools and a streamlined process for agility, in addition to developing cloud native applications on the mainframe and containerizing those applications.
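As a purely illustrative sketch of that last recommendation, the fragment below wraps a stubbed mainframe transaction behind a small JSON-friendly function. Everything here is hypothetical: the `ACCTSTAT` transaction name, the fixed-width record layout, and the function names are invented for illustration, and a real integration would go through a mainframe connector or API gateway rather than a local stub.

```python
import json

# Hypothetical sketch: a thin API facade over a legacy mainframe
# transaction. The mainframe call is stubbed out; in practice it would
# go through a gateway or connector product, not a local function.

def call_mainframe_transaction(txn_id: str, payload: dict) -> str:
    """Stub for the legacy call; returns a fixed-width record string."""
    # Legacy systems often return positional, fixed-width fields.
    return f"{txn_id:<8}{payload.get('account', ''):<10}OK"

def get_account_status(account: str) -> dict:
    """Modern, JSON-friendly wrapper exposed to cloud-native consumers."""
    record = call_mainframe_transaction("ACCTSTAT", {"account": account})
    # Slice the fixed-width record into named fields.
    return {
        "transaction": record[0:8].strip(),
        "account": record[8:18].strip(),
        "status": record[18:].strip(),
    }

if __name__ == "__main__":
    print(json.dumps(get_account_status("12345")))
```

The point of the facade is that cloud-native consumers see only the named-field JSON shape, while the positional record format stays an internal detail of the wrapper.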

IT executives expect significant usage increases in both mainframe (35%) and cloud-based applications (44%) over the next two years.

Consider how you can extract more value from both your mainframe and cloud investments. Blending mainframe power, reliability and security into the cloud landscape is essential to achieve the enterprise-wide agility and capability required to keep pace with ever-changing business needs.

* Study Methodology: Oxford Economics and the IBM Institute for Business Value surveyed 200 IT executives (for example, CIOs, Chief Enterprise architects) in North America.


What Would Happen to Cloud Computing Careers in the Next 5 Years? Will They Perish? – Analytics Insight

Let us see whether cloud computing careers have a chance or will die out in the next five years.

Cloud computing is undoubtedly the hottest field for any technological worker right now. The cloud has become a popular buzzword among both large and small businesses. The adoption of the cloud by the rest of the globe has resulted in a tremendous increase in cloud-related jobs.

Cloud computing allows us to link anything digitally nowadays. It opens up a whole new universe of opportunities in terms of jobs, applications, services, and platforms. Cloud computing's future can be seen as a mix of cloud-based software and on-premises computing, which will aid the creation of hybrid IT solutions.

Because there is so much competition among cloud providers, more data centres will be available at a reduced cost. With the combination of the IoT (Internet of Things) and cloud computing, we can save data in the cloud for further analysis and improved performance. Networks will be speedier, and data will be received and delivered more quickly.

These services help organizations reach their respective objectives. Many studies have shown that cloud computing will be one of the most important technologies of the future, with software-as-a-service solutions accounting for more than 60% of all workloads.

The cloud has a number of advantages that make its future in the IT industry promising. The redesigned cloud is scalable and versatile, allowing for data center security and control. Organized procedures and better techniques for processing data will be important aspects of cloud computing.

With a staggering 18.28 percent rise in employment in the last year, cloud computing and management services are the fastest-growing employer in the software business. IT services and consulting came second, with a comparatively small 3.55 percent rise in personnel.

When you realize that these figures only cover five firms whose core business is cloud computing, not cloud positions within other corporations, the true number of cloud jobs is substantially larger.

Software Engineer, Java Developer, Systems Engineer, Network Engineer, Systems Administrator, and Enterprise Architect are some of the most common cloud occupations. However, you don't have to be an engineer to be a part of this rapidly growing job market; other roles requiring cloud knowledge, such as marketing managers or market research analysts, are in high demand.

The most common prerequisite for technical cloud employment is cloud computing itself. Other common criteria include Oracle Java, Linux, Structured Query Language (SQL), UNIX, Software as a Service (SaaS), Python, and the like.

With concerns about cloud security dissipating, an increasing number of businesses are preparing to migrate to the cloud. According to a recent report, the number of jobs created by cloud computing is expected to rise from 6.7 million to 13.8 million, representing job growth of 108 percent in the United Kingdom alone.

There has never been a better moment to be a cloud expert, with more open cloud positions than cloud technicians to fill them. It is getting increasingly difficult for firms to find the talent they require. To attract the best personnel, you must know where to look and how to time your job postings effectively.

So there is a visible skills gap in this career field, between the cloud computing skills organizations expect and those that young cloud personnel and technicians possess. But it can be said with some assurance that within the next five years, cloud computing careers will only flourish, and certainly not die.

Cloud computing is strong and expansive, and it will continue to grow and deliver many benefits in the future, since it is incredibly cost-effective and businesses can use it to grow. Cloud computing careers thus have a bright future ahead, with benefits for both the host and the customer, although it's important that company owners stay up to date on the latest developments in cloud technology.


Follow this link:
What Would Happen to Cloud Computing Careers in Next 5 Years? Will They Perish? - Analytics Insight

The future of work in the heterogeneous diverse cloud – BCS

Today, we have seen the proliferation, popularisation and eventual propagation of cloud computing, mobile device ubiquity and new algorithmically-enriched approaches to Artificial Intelligence (AI) and Machine Learning (ML)... all of which have further changed the nature of work.

The sum consequence of much of the development on the post-millennial technology curve is a new approach to digitally-driven work. To explain this shuddering generalisation, digital work means tasks, processes, procedures and higher-level workflows that can be encoded into data in order for their status to be tracked, analysed and managed.

Part of the total estate of big data that now hovers over all digital assets in the modern workplace, digital workflows can now be built that are more intelligently shared between humans and machines.

Where processes are accurately definable, typically repeatable and easily replicable, we now have the opportunity to use autonomous software controls such as Robotic Process Automation (RPA) and chatbots to shoulder part of our daily tasks. Although there is a period of process mining and process discovery that we need to perform before we can switch on the autonomous advantage, once we do so we can start to focus human skills on more creative higher-value tasks.

Where all of this gets us is to a point where we can be intelligently granular about how we place elements of our total digital workload across data services, across application resources, across cloud backbones and ultimately, across people.

To enable digital work, we still have some challenges to overcome, i.e. we need to be able to communicate with each other, as humans and machines, in a consistent yet essentially decoupled way. Because not every work task has had its genome decrypted, we are still searching for ways to encapsulate certain aspects of enterprise workflows.

This is tough because we're aiming at a moving target, i.e. market swings and the dynamism of global trade. But, as we start to build new work systems, we can start to operate workflows that are intelligently shared across different interconnected cloud services, for a variety of core reasons.

Enterprises can now create a layered fabric of work elements and functions shared across different Cloud Services Providers (CSPs), sometimes separated-out on the basis of different cloud contract costs, sometimes for reasons related to geographic latency or regulatory compliance, or often dispersed across more than one cloud due to the various optimisation functions (processing, storage, transactional Input/Output capability, GPU accelerated etc.) that exist in different services.
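The placement criteria just described (cost, latency, regulatory compliance, hardware fit) can be sketched as a simple scheduling policy. This is a minimal illustration, not any vendor's actual placement engine; the provider names, prices and latencies are invented:

```python
# Hypothetical sketch of poly-cloud workload placement: hard constraints
# (data residency, latency ceiling, required hardware) filter candidate
# regions, then the cheapest eligible region wins. All values are invented.

WORKLOAD = {"needs_gpu": True, "data_residency": "EU", "max_latency_ms": 50}

REGIONS = [
    {"name": "csp-a-eu-west",    "cost_per_hour": 0.90, "latency_ms": 20,
     "jurisdiction": "EU", "has_gpu": True},
    {"name": "csp-b-us-east",    "cost_per_hour": 0.55, "latency_ms": 95,
     "jurisdiction": "US", "has_gpu": True},
    {"name": "csp-c-eu-central", "cost_per_hour": 0.70, "latency_ms": 35,
     "jurisdiction": "EU", "has_gpu": False},
]

def eligible(region, workload):
    """Hard constraints: residency, latency ceiling, required hardware."""
    return (region["jurisdiction"] == workload["data_residency"]
            and region["latency_ms"] <= workload["max_latency_ms"]
            and (region["has_gpu"] or not workload["needs_gpu"]))

def place(workload, regions):
    """Among eligible regions, pick the cheapest; None if nothing qualifies."""
    candidates = [r for r in regions if eligible(r, workload)]
    return min(candidates, key=lambda r: r["cost_per_hour"]) if candidates else None

print(place(WORKLOAD, REGIONS)["name"])  # csp-a-eu-west
```

Here the cheapest region overall (csp-b-us-east) loses on residency and latency, which is exactly why enterprises end up dispersed across more than one provider.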

If private on-premises cloud combined with public cloud is what we now understand to be the de facto most sensible approach we know as hybrid cloud, then this (above) deployment scenario is one move wider. Where workloads are placed across clouds, we are in hybrid territory; but where individual data workflows are dispersed across and between different cloud services, we get to poly-cloud.

The architectural complexity of interconnected cloud services that are established around these terms is not hard to grasp. In order to make this type of lower substrate diversity manageable, cost-effective and above all functional, enterprises will need to embrace a platform-based approach to hyperconverged cloud infrastructure.

Most organisations struggle to effectively manage heterogeneous cloud environments and move workloads back and forth between and among them. Establishing visible benefits from this type of approach to cloud is only possible if the business is able to think of its cloud infrastructure as an invisible foundational layer.

Managing a multi-cloud and poly-cloud infrastructure means being able to simplify cloud management and operations requirements across an enterprise's chosen estate of interconnected cloud services. With different providers all offering different software toolsets, different management dashboards, different configuration parameters and so on, there is no point-and-click solution without a hyperconverged higher platform layer in place.

As theoretical as some of the discussion here sounds, many practical examples already exist. South Africa's largest bank, Nedbank, has been bold with its cloud-based approach, designed to deliver cost-effectively on its diverse bandwidth requirements.

Needing low-latency remote worker provision for its 2,000-strong developer function in India (but capable of straddling less performant latency parameters for other functions), the company had to build systems capable of superior service that would be a win-win for staff and customers alike.


UCL and AWS partner to launch digital innovation centre – Healthcare IT News

University College London (UCL) and Amazon Web Services (AWS) are joining forces to launch a centre for digital innovation.

The centre, to be hosted at the IDEALondon technology hub, will help healthcare and education organisations to accelerate digital innovation and address global issues in the sectors.

It has sent out its first call for engagement via the Impact Accelerator, a programme offered by UCL and AWS which aims to boost startups by providing advice, education and funding initiatives.

Successful applicants to the UCL Centre for Digital Innovation (CDI) will be supported to build a product prototype, or helped to design it for scalability if one already exists. AWS will also provide AWS credits of up to $500,000 (£370,000) per year to help fund development of prototypes and new solutions.

Healthcare and education organisations, research teams, startups and UCL's technology spinouts are eligible to apply.

WHY IT MATTERS

Projects should look to solve a global issue in health or education using cloud computing and have a real user and customer in mind. The aim is to produce evidence-based, commercially sustainable technological innovations.

UCL will draw on expertise from several faculties including medical sciences, IOE, engineering and life sciences, and experts from University College London Hospital (UCLH).

Successful applicants will be given access to training, education and technical support, including a resident AWS solutions architect, domain experts in health tech and ed tech, and immersion days on specialist topics.

THE LARGER CONTEXT

AWS launched its first accelerator programme for UK-based digital healthcare startups in 2021. It recently announced 12 innovators who have been selected to take part in the programme.

Last year AWS announced plans to open an infrastructure region in the United Arab Emirates (UAE) in the first half of 2022. The expansion is part of AWS's aim to build on its current 80 availability zones across 25 geographic regions.

ON THE RECORD

UCL CDI director Graça Carvalho (UCL Faculty of Engineering Sciences) said: "This ambitious collaboration brings together the strength of UCL and AWS to build trust within digital innovation, allowing hospitals, universities, patients, students, research teams and UCL spinouts to use cloud-based technology to compete on a global stage."

John Davies, director, regional government, UK, worldwide public sector at AWS, said: "By bringing together UCL's world-renowned academic rigour with AWS cloud technologies and culture of innovation, we hope to provide healthcare and education organisations with a springboard to help them to address some of the toughest challenges facing society right now."

UCL pro-vice-provost (AI), Professor Geraint Rees, said: "Innovative digital solutions to the world's problems are best created in collaboration between academic and commercial organisations. The UCL CDI, powered by AWS, combines the best of both domains. We believe that this combined endeavour will lead us to solutions that are evidence-based, commercially sustainable and focused on the needs of the world's citizens."


BrainChip, SiFive partner to bring AI and ML to edge computing – VentureBeat


AI processor maker BrainChip, which makes ultra-low-power neuromorphic chips and supporting software, and SiFive, founder of the RISC-V computing genre, today announced they have combined their respective technologies to offer chip designers optimized artificial intelligence (AI) and machine learning (ML) for edge computing.

BrainChip's AI engine, Akida, is an advanced neural networking processor architecture that brings AI functionality to edge and cloud computing in a way that wasn't previously possible, thanks to its high performance and ultra-low power usage, the company said. SiFive Intelligence solutions, with their highly configurable multi-core, multi-cluster-capable design, integrate software and hardware to accelerate AI/ML applications, BrainChip CMO Jerome Nadel told VentureBeat.

The integration of BrainChip's Akida technology and SiFive's multi-core-capable RISC-V processors is expected to provide an efficient solution for integrated edge AI computing, Nadel said.

RISC-V (pronounced "risk-five") is an open instruction-set architecture based on established reduced instruction set computing (RISC) principles. It's an open-source project available to anybody who wants to use it. RISC-V represents a major step forward in the data processing speed required by the new, much heavier applications (such as machine learning, AI and high-resolution video) that are coming into daily use. RISC-V appears to be a natural fit for BrainChip's neural networking processor architecture.

SiFive Intelligence-based processors have a highly configurable multi-core, multi-cluster-capable design that has been optimized for a range of applications requiring high-throughput, single-thread performance while under tight power and area constraints, Nadel said.

BrainChip's Akida mimics the human brain to analyze only essential sensor inputs at the point of acquisition, processing data with efficiency, precision, and economy of energy, Nadel said. Keeping AI/ML local to the chip and independent of the cloud reduces latency while improving privacy and data security, he said.

BrainChip's technology is based on its spiking neuron adaptive processor (SNAP) technology, which it licenses to technology partners. SNAP offers a development solution for companies entering the neuromorphic semiconductor chip market. It is a core enabling technology in neuromorphic semiconductor chips that supports various applications, such as gaming, cybersecurity, robotics and stock market forecasting, among others.

"As we expand our ecosystem of portfolio partners, we want to be sure that our relationships are built on complementary technologies, enabling capabilities and breadth of environments, so that we can expand opportunities to as many potential customers as possible," Nadel said. "Driving our technology into a SiFive-based subsystem is exactly the type of partnership that meets these goals."

VentureBeat asked Jack Kang, senior vice president of Business Development, Customer Experience (CX), Corporate Marketing at SiFive, a few specific questions about the news and the relevance of the partnership.

VentureBeat: What is the no. 1 business takeaway from this announcement?

Jack Kang: For SiFive, this announcement shows the ongoing uptake of the SiFive Intelligence family of RISC-V-based processor IP. More companies are choosing RISC-V to be part of their product roadmap strategy, and SiFive is the leading provider of commercial RISC-V IP. In the emerging greenfield markets of AI/ML-enabled platforms, such as the edge processing market targeted by BrainChip, the performance per area and efficiency advantages of SiFive processor architecture make the SiFive Intelligence family a competitive choice.

VentureBeat: Does BrainChip use any of Arm's IP in its chips? Arm is known for low power and high performance.

Kang: BrainChip has discussed Arm IP for their product line. Arm processors have built a reputation for low power based on comparisons to x86-based products. SiFive Intelligence products compare well to Arm products through offering improved performance-per-area of up to 30%, combined with a single ISA for simpler programming, and a modular approach that aligns well to working with hardened AI IP such as that developed by BrainChip.

VentureBeat: Can you expand upon this statement: "(BrainChip) mimics the human brain to analyze only essential sensor inputs at the point of acquisition"?

Kang: This statement refers to the ability of humans to focus on what's important, for example, listening to a conversation in a coffee shop while still registering and acknowledging background sounds. The BrainChip solution will mimic this ability to reduce power and increase efficiency by focusing on the important data being processed. This is similar to, but a step beyond, the adoption of mixed and lower-precision data types (INT8 vs. FP16) to speed up and improve the efficiency of AI/ML processing.
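The lower-precision idea Kang references can be sketched with a simple symmetric INT8 quantizer. This is a generic illustration of the technique, not BrainChip's or SiFive's actual scheme: floats are mapped to 8-bit integers with a shared scale, shrinking memory traffic roughly 4x versus FP32 at the cost of bounded rounding error.

```python
# Generic symmetric linear quantization sketch (illustrative, not any
# vendor's scheme): map floats to int8 with one scale factor per tensor.

def quantize_int8(values):
    """Quantize a list of floats to int8 codes plus a scale factor."""
    max_abs = max(abs(v) for v in values)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the int8 codes."""
    return [x * scale for x in q]

activations = [0.12, -0.50, 0.33, 0.02]
q, scale = quantize_int8(activations)
restored = dequantize(q, scale)

# Rounding error is bounded by half a quantization step.
max_err = max(abs(a - b) for a, b in zip(activations, restored))
assert max_err <= scale / 2 + 1e-9
```

The design trade-off is exactly the one Kang describes: spend fewer bits (and therefore less power and bandwidth) on data whose small errors don't change the result.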

BrainChip, based in Aliso Viejo, California, competes in the burgeoning intelligent-edge chip market with Nvidia's deep learning GPUs, Keras, TFLearn, Clarifai, Microsoft Cognitive Toolkit, AWS Deep Learning AMIs and Torch. Nvidia owns about 80 percent of the global GPU (graphics processing unit) market. G2.com has market information here. Availability of the new SiFive/BrainChip solutions will be announced at a later date.



Real Words or Buzzwords: Edge cloud and the evolving internet – SecurityInfoWatch

Editor's note: This is the 60th article in the Real Words or Buzzwords? series about how real words become empty words and stifle technology progress.

The original Internet that we built was designed on decades-old technology (originally dial-up telephone lines) and released to a society that had little exposure to computer technology; for many people, that exposure was mostly at work. As the use of websites expanded, information technology continued to advance. The core of the Internet was expanded to keep up with demand, with high-speed networking and computer virtualization facilitating and accelerating its growth.

Internet expansion continued chiefly under the original Internet architecture, which became increasingly problematic with the arrival of mobile devices operated by humans, and of connected devices that didn't require human operation (the Internet of Things, or IoT).

As of 2018, 90% of the world's data had been created in the prior two years. Every day, we create roughly 2.5 quintillion bytes of data. With the explosive growth of IoT, this data creation rate will become even greater. And it's all happening at the edge of the Internet.

Synergy Research says more than 100 hyperscale facilities were built in 2020, making the total number close to 600. As of January 2021, Amazon, Microsoft, and Google collectively accounted for over half of all major data centers and continue to be significant drivers of data center growth. Data centers continue to be located where land and electricity are cheap.

That's a significant factor in those three companies all investing significantly in data centers in Idaho. In February 2022, Meta (formerly Facebook) announced plans to build an $800M hyperscale campus in Idaho. As part of that move, it agreed to buy Iowa Wind Farm's entire capacity to power its data center campus.

Thus, today's data centers are about as remote as they can get from the majority of end-users and from the IoT devices that generate the vastly increased amounts of data needing processing. They aren't solving the mobile and IoT Internet users' problem.

This is why one year ago, Vapor IO and VMWare announced the formation of the Open Grid Alliance (OGA), joined by Dell Technologies, DriveNets, MobiledgeX, and PacketFabric as founding members. The OGA, now with 32 members, proposes vastly increasing the number of small edge data centers along with the number of direct connects to the Internet.

Consider, for example, a 5G and 6G wireless future with affordable and highly available high-speed fiber-optic networking running to an edge data center at nearly every cell tower. That would mean microsecond transaction times for high-volume IoT data processing at the edge, such as for security and retail video analytics and manufacturing's production-line machine vision.
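A back-of-envelope calculation shows why cell-tower proximity matters: signals in optical fiber travel at roughly two-thirds the speed of light, about 200 km per millisecond, so distance sets the floor on round-trip latency. The sketch below uses that approximation with illustrative distances:

```python
# Back-of-envelope fiber propagation latency. Light in fiber covers
# roughly 200 km per millisecond (~2/3 of c); distances are illustrative.

FIBER_KM_PER_MS = 200.0  # approximate one-way signal speed in fiber

def round_trip_ms(distance_km):
    """Round-trip propagation delay in milliseconds, ignoring processing time."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(1))     # ~0.01 ms (10 microseconds) to an edge site at a nearby tower
print(round_trip_ms(1500))  # ~15 ms to a distant hyperscale data center
```

Propagation alone puts a nearby edge data center in the tens-of-microseconds range, while a remote hyperscale facility is hundreds or thousands of times slower before any processing even begins.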

It would also mean high-speed, high-volume wireless IoT data connections, which are critical for safe autonomous vehicle operation and for city traffic management. And it's a crucial need for smart cities and smart buildings, which today's affordable technology is bringing more broadly within reach.

The original Internet had only a few dozen, and then a few thousand, end users. As of January 2022, 4.95 billion people (roughly 62.5% of the world's population) use the Internet. Although there have been many government and private projects around the world named "Next Generation Internet" over the past two decades, all were based on previous-era technology.

This is why the Linux Foundation has an edge computing project, as do hundreds of major IT-domain players; why IT and business analyst firms have been paying attention; and why a Google search on "edge computing" (in quotes) generates over 17 million results.

So, it should be no surprise that at ISC West 2022, two advanced-technology IT companies announced themselves to the physical security industry. Vapor IO's Kinetic Grid platform brings together both high-speed networking and cloud computing resources to establish highly affordable edge computing.

The Kinetic Grid's resource placement considers specific geographies, population centers, and fiber routes to offer low-latency colocation and connectivity in service of first- and last-mile networks and nearby premises. Each Kinetic Edge market becomes part of a nationwide Kinetic Grid via private fiber backbones that connect across markets, offering built-in edge-to-edge capabilities.

Hypersive, an IT company whose founders have deep IT and physical security operations expertise and deployment experience, delivers current building management and physical security applications as a service: in the cloud, on-premises, or near-premises, using Vapor IO's Kinetic Grid to optimize both the cost and performance of high-data-volume systems and systems with many integration points. This makes cloud service integrations for on-premises systems doable. Hypersive's first offering is Milestone XProtect VMS as a service, which simplifies deployments for integrators and facilitates high-performing, affordable XProtect system expansion to any site location. Integrators and end-users can focus on how best to optimize their use of the XProtect VMS without having to attend to deployment details, including camera licenses.

VMS server deployments that would formerly have been complex, taking weeks or months, can now be accomplished in days, with high availability now possible for every site deployment regardless of its size or location.

Along with the Internet's expanded architecture for edge computing come several IT terms that, while not new, have meanings different from what their English words seem to state. That will be the subject of the next Real Words or Buzzwords? article.

About the author: Ray Bernard, PSP CHS-III, is the principal consultant for Ray Bernard Consulting Services (RBCS), a firm that provides security consulting services for public and private facilities (www.go-rbcs.com). In 2018, IFSEC Global listed Ray as #12 among the world's Top 30 Security Thought Leaders. He is the author of the Elsevier book Security Technology Convergence Insights, available on Amazon. Follow Ray on Twitter: @RayBernardRBCS.

© 2022 RBCS. All Rights Reserved.

The rest is here:
Real Words or Buzzwords: Edge cloud and the evolving internet - SecurityInfoWatch