Two Edge Servers from Inspur Information Win 2022 Red Dot Awards – Business Wire

SAN JOSE, Calif.--(BUSINESS WIRE)--Inspur Information, a leading IT infrastructure solutions provider, was honored to have two products in its Edge microserver portfolio win recognition for excellence from Red Dot Award: Product Design 2022. EIS800 and Swing P101 were both winners, lauded for their clean and compact industrial design, modularity and ease of use.

The Red Dot Design Award is an internationally recognized seal of quality and good design with a history that spans more than 60 years. 48 experts from around the world serve as the Red Dot Jury, which meets yearly to identify the best new entries in Product Design, Brands & Communication Design and Design Concept. Inspur Information competed in the original and most competitive category, Product Design. The Red Dot Jury follows the motto "In search of good design and innovation" and assesses each product individually and comprehensively to identify the entries with the most outstanding design quality.

An adaptable and minimalist package for Edge computing

EIS800 is an intelligent, portable and easy-to-deploy Edge microserver for the digital era. Its specifically designed extensible modules and interfaces achieve high customizability, which allows for rapid deployment across a wide variety of scenarios for intelligent Edge computing. This adaptability, combined with its wide variety of wireless communication protocols including ZigBee, 4G/5G, Wi-Fi/Bluetooth and GPS, makes it an ideal candidate for nearly any environment or situation. Its structural design is both simple and compact. The housing is manufactured with rugged die-cast and anodic oxidation techniques to provide heat dissipation along with water and dust resistance. The IP65 protection rating and operating temperature range of -40°C to 70°C ensure normal operation in a wide range of harsh edge environments.

An easily modifiable Edge solution

Swing is a highly integrated edge server that improves the computing efficiency of edge AI inferencing scenarios. It can provide more compact and efficient AI computing power for AI teaching assistants in universities, AI software algorithm development, intelligent medical scenarios such as medical image recognition and disease screening, and other applications. Two expansion card slots can be quickly customized with various GPU, ASIC and FPGA accelerator cards. This customization allows optimal functionality in various setup scenarios to quickly complete targeted application inference and computing architecture designs. It features a metallic finish from its anodized surface. The grille-like design is extremely minimalistic, rendering a sleek look that is excellent at heat dissipation. All of these features work together to enhance product development in a clean and simple package.

"We are thrilled to have EIS800 and Swing P101 be recognized for their superior product design by Red Dot," said Park Sun, General Manager of Edge Computing, Inspur Information. "We are excited to introduce the value of these products to customers. The acceleration of digital transformation has created huge amounts of real-time data that need to be collected and processed in edge environments. An increasing number of AI processing workloads require more flexible and more distributed solutions. EIS800 and Swing make that possible."

About Inspur Information

Inspur Information is a leading provider of data center infrastructure, cloud computing, and AI solutions. It is the world's second-largest server manufacturer. Through engineering and innovation, Inspur Information delivers cutting-edge computing hardware design and extensive product offerings to address important technology sectors such as open computing, cloud data center, AI, and deep learning. Performance-optimized and purpose-built, our world-class solutions empower customers to tackle specific workloads and real-world challenges. To learn more, visit https://www.inspursystems.com.

Read the original here:
Two Edge Servers from Inspur Information Win 2022 Red Dot Awards - Business Wire

How to Unblock Twitter in Russia in 2022 [Avoid the Russian Ban] – Cloudwards

Twitter is a trusted online news and social media platform. As such, its ban in Russia is part of the government's coordinated efforts to stifle the free flow of information and step up the spread of propaganda about the Ukrainian invasion through state-run media outlets. In this guide, we'll show you how to unblock Twitter in Russia to open the door to accurate and independent information.

As a prerequisite, you'll need a virtual private network (VPN) to successfully bypass Twitter geo-restrictions in Russia. However, if you haven't signed up for a VPN yet, fret not. We'll reveal tried-and-tested VPNs that bypass the Twitter ban with ease, so keep reading. If you're just looking for a quick and easy answer, then check out ExpressVPN.

No law in Russia outright bans the use of VPNs within the country, but it's illegal to use VPNs to access blocked content. That said, the Putin regime has blocked popular VPN services for failing to cooperate with its censorship efforts.

ExpressVPN is the best VPN for Russia and the best service to unblock social media in this restrictive country. For these reasons, we'll use it in our guide to demonstrate how to get around the Twitter ban. Unblocking Twitter in Russia with a VPN is as easy as installing the app, connecting to a server outside Russia and then opening Twitter as usual.

Access to reliable and independent information is a fundamental human right, and no country should bar its citizens from enjoying this right. However, in some restrictive countries such as Russia, that right isn't guaranteed.

The government wants to be on top of the content Russian citizens consume, especially during its war with Ukraine. To achieve that, it has intensified efforts to block social media providers and international news media. The ban is geared toward isolating Russians from each other and the rest of the world, leaving them with no choice but to rely on state-affiliated news outlets for information.

However, while the move to block Twitter works in the government's favor, it's a stumbling block for ordinary Russian citizens, marketers, advertisers and brands in Russia that rely on Twitter.

The Russian authorities began throttling Twitter on Feb. 26, 2022. The government restricted the social media network on leading Russian telecommunication companies including Beeline, MegaFon, Rostelecom and MTS.

The throttling made the site slow, making it nearly impossible for Twitter users to send tweets. Twitter confirmed the restricted access and said it was looking for a workaround before the situation morphed into a full-scale restriction.

The Twitter restriction came as Russia passed a new draconian law in an attempt to further crack down on protests. As per the new law, independent journalists and citizens caught spreading "fake" information that contradicts the Russian government's narrative on the Ukrainian war risk a prison sentence of up to 15 years.

Twitter kept its promise to find a way around the ban by embracing the dark web with a Tor service to outfox the Russian authorities. We can't guarantee the effectiveness of the new Twitter feature, but as far as we know, the Tor browser and network aren't as effective as a VPN in these situations.

Keep in mind that bypassing the Twitter ban in Russia isn't only about accessing the social media platform. Besides finding your way around the ban, you have to cover your digital tracks in case the Russian authorities decide to pursue you.

Both Tor and a VPN hide your real IP address and location, and both can get you into blocked social media networks. However, while the Tor network hides who you are, it isn't as effective when it comes to safeguarding your broader digital privacy. By contrast, a VPN emphasizes privacy: it hides your identity, your true location and what you do online. Read our VPN vs proxy vs Tor guide to learn more.

A VPN provides digital security and privacy, allowing you to overcome attempts by the Russian government to strangle the free flow of information on Twitter.

ExpressVPN is the best VPN for Russian users to access any social media platform.

ExpressVPN is the best VPN service out there, and its excellent security and privacy make it a great option to access Twitter in Russia. It has over 3,000 servers in 94 countries, including Russian neighbors such as Norway, Poland and Finland, that you can connect to for optimal performance. Each connection utilizes the virtually unbreakable AES-256 encryption to keep your internet traffic away from prying eyes.

Besides that, there's a kill switch that ensures every bit of your online traffic passes through the encrypted tunnel. The DNS leak protection prevents IP leaks that could tip off the Russian authorities about your real location, whereas the TrustedServer technology ensures the VPN servers wipe user data with every reboot.

ExpressVPN abides by its strict no-logs policy, meaning it would have no data to share if coerced or subpoenaed by the Russian authorities. Read our exhaustive ExpressVPN review for more details. There's also a 30-day money-back guarantee, meaning you can try it risk-free.

NordVPN's Onion over VPN lets you route your internet traffic through the Tor network and a VPN for an extra layer of security.

NordVPN matches ExpressVPN's performance in many aspects, except speed. It has over 5,400 servers spread across 60 countries, including some Russian neighbors: Finland, Poland, Latvia and Norway. Moreover, it uses the AES-256 encryption standard and comes with a kill switch and DNS leak protection.

What sets NordVPN apart from the other two VPNs on our list is its suite of advanced security features such as double-hop servers. As the name suggests, the servers route your internet traffic through two servers, adding an extra layer of protection. These specialty servers come in handy if you want to share sensitive information without the fear of government eavesdropping.

In addition, NordVPN has obfuscated servers that let you use a VPN in heavily restrictive environments. It also comes with a strict no-logs policy, and although it has suffered a security breach before, it's still a trustworthy provider. Read our comprehensive NordVPN review for more. It's also a bit cheaper than our top pick, which makes it a better option for those on a budget.

The NoSpy servers are quite adept at bypassing censorship and surveillance in restrictive environments.

CyberGhost is another reputable VPN to unblock Twitter in Russia. Like our first two picks, it has all the basic features, including AES-256 encryption, a kill switch and DNS leak protection. In addition, it has a fleet of over 7,700 servers in 91 countries, including Russia. However, you don't need a Russian IP address to access Twitter in Russia.

What distinguishes it from other VPN services is the NoSpy servers. These are anti-surveillance servers built for use in restrictive countries to ward off third-party meddling and monitoring. Moreover, CyberGhost plays by its no-logs policy and has no history of a security breach. Finally, CyberGhost is one of the most affordable VPNs out there, as you can read in our full CyberGhost review.

We don't vouch for free VPNs because most of them are unreliable. Remember: when it comes to unblocking Twitter in Russia, you need a reliable VPN provider with top-notch security and privacy. In most cases, free VPNs lack the robust VPN features (servers, protocols and encryption) needed to bypass network restrictions.

That said, not all free VPNs are unreliable. For example, our best free VPNs, Windscribe and ProtonVPN, guarantee excellent security and may be an option if you want to access Twitter in Russia. Sadly, they come with usage limits.

With Windscribe you get 10GB per month, which might be enough if you only want to tweet and read tweets. You also get access to 11 server locations out of 25. ProtonVPN, on the other hand, gives you unlimited free data, but only lets you use servers in three countries.

Peace talks between Russia and Ukraine are ongoing, and we hope the two nations will find a truce in the coming days. As it stands, the ground is becoming hostile to social media platforms. Meta, Facebook's parent company, had its Facebook and Instagram platforms banned for allegedly committing "extremist activities."

If the recent censorship spree is anything to go by, then we can confidently say the Russian government isn't going to lift the ban on Twitter anytime soon. For this reason, you have to arm yourself with the best VPN to overcome censorship. We recommend getting started with ExpressVPN, thanks to its excellent security and privacy. NordVPN and CyberGhost are cheaper alternatives.

Have you used a VPN to access Twitter in Russia? Which VPN service did you use? Are you satisfied with the performance of that VPN? We'd like to hear about it in the comment section. As always, thanks for reading.

Follow this link:
How to Unblock Twitter in Russia in 2022 [Avoid the Russian Ban] - Cloudwards

Amazon, Google or Microsoft? Boeing chooses all of the above for cloud computing services – GeekWire

Boeing engineers huddle over a computer to work on an aircraft design. (Boeing Photo / Bob Ferguson)

The billion-dollar competition to provide Boeing with cloud computing services is finished, and the winner is a three-way split. Amazon Web Services, Google Cloud and Microsoft are all getting a share of the business, Boeing announced today.

In a LinkedIn post, Susan Doniz, Boeing's chief information officer and senior VP for information technology and data analytics, called it a "multi-cloud partnership."

"This represents a significant investment in the digital tools that will empower Boeing's next 100 years," she wrote. "These partnerships strengthen our ability to test a system or an aircraft hundreds of times using digital twin technology before it is deployed."

Doniz said that becoming more cloud-centric will provide Boeing with global scalability and elasticity without having to predict, procure, maintain and pay for on-premises servers.

Financial details relating to the multi-cloud partnership were not disclosed.

Historically, most of Boeing's applications have been hosted and maintained through on-site servers that are managed by Boeing or external partners. You could argue Boeing's extensive intranet blazed a trail for today's cloud computing services.

"Marketing and performing computer services involves a whole new way of doing business," Boeing President T.A. Wilson declared in 1970 when Boeing Computer Services was formed.

In recent years, Boeing has been transitioning from its own aging computer infrastructure to cloud providers. For example, in 2016 the company chose Microsoft Azure to handle a significant proportion of its data analytics applications for commercial aviation.

At the time, that was considered a notable win for Microsoft, but Boeing also has maintained relationships with AWS, Google and other cloud providers.

Some had expected Boeing to pick a primary provider as a result of the just-concluded bidding process. Last year, The Information quoted its sources as saying that the deal could be worth at least $1 billion over the course of several years and that Andy Jassy, who is now Amazon's CEO, saw it as a must-win for AWS.

But if Boeing is favoring one member of the cloud troika above the others, it's being careful not to tip its hand publicly: today's announcement consistently lists the three companies in alphabetical order. (If you happen to know who the big winner is, send us a tip.)

Update for 4 p.m. PT April 6: In an interview with Insider, Amazon Web Services senior vice president of sales and marketing, Matt Garman, discussed Boeing's decision on parceling out the contracts for cloud computing services and said just about what you'd expect an executive in his position to say.

"They're announcing that they're going to have a couple of different partnerships," Garman said. "I think the vast majority of that will land with AWS."

Cloud services aren't the only connection between Amazon and Boeing: In its news release about the Boeing cloud deal, Amazon notes that it has more than 110 Boeing aircraft in its Amazon Air delivery fleet.

The other two cloud titans are also talking: Microsoft noted that it's been working with Boeing for more than two decades, and that today's deal will deepen the relationship. Meanwhile, Google emphasized its efforts to match 100% of the energy powering its cloud workloads with renewable energy, making it "the cleanest cloud in the industry."

Read the original:
Amazon, Google or Microsoft? Boeing chooses all of the above for cloud computing services - GeekWire

The key is the cloud: How to keep up with the speed of innovation – CIO

At DISH Network, cloud-adoption strategies vary by when each part of its business started, from those born in the cloud to legacy sectors deploying cloud on an opportunistic basis. But one thing is clear to Atilla Tinic, the company's EVP and CIO: "I do think the key is the cloud." He added: "The strategy around cloud is not ROI on a case-by-case basis. It's a must if a company wants to stay relevant."

Tinic is among the speakers at CIO's Future of Cloud Summit, taking place virtually April 12-13. Focusing on speed, scale and software innovation, the event will gather technology executives to discuss both strategy and concrete implementation tactics.

The program begins April 12 and will cover aspects of cloud innovation and agility, with Jessica Groopman, founding partner at Kaleido Insights, kicking off the day with a look at three macrotrends reshaping cloud and software innovation. She will also field questions in a live discussion.

Throughout the day, CIOs from leading companies will dive into aspects of their strategy. Christopher Marsh-Bourdon, head of hybrid environments at Wells Fargo Bank N.A., will offer insights on designing hybrid cloud environments for security and flexibility. Shamim Mohammad, EVP and chief information and technology officer at CarMax, will present a case study on how the cloud enables the company's signature Instant Offer feature. Addressing how to maximize the value of every dollar spent in cloud will be Jennifer Hays, SVP of engineering efficiency and assurance at Fidelity Investments, along with FinOps Foundation Executive Director J.R. Storment.

Michael Riecica, director of security strategy and risk in Rockwell Automation's Chief Information Security Office, will drill into the security aspects of cloud strategy. And hear how the U.S. Federal Reserve System leverages cloud smart strategies from System CIO Ghada Ijam.

James Cham, a partner at Bloomberg Beta, will offer a venture fund perspective on changes to watch in software development, deriving value from big data, and a view into where AI fits in. Cham will also lead a live discussion on how cloud and other technology investments can be used as catalysts for building business value.

Another opportunity for interaction with peers and experts will take place in a workshop on cloud as a business strategy platform led by Kevin L. Jackson, CEO of GlobalNet and the host of Digital Transformers.

Hear from world-class analysts such as Dion Hinchcliffe, vice president and principal analyst at Constellation Research, who will preview what cloud will look like in five years and advise CIOs on how to address the challenges of fast change, while successfully dealing with talent scarcity and technical debt. IDC's Dave McCarthy, research vice president of cloud infrastructure services, will advise on how to navigate a future of digital infrastructure focused on cloud. He will cover application modernization, the best approach for multi-cloud deployments and where to invest in automation. McCarthy will follow up the presentation with a live discussion Wednesday on cloud trends.

Wednesday will focus on shifting into a cloud native future, starting with a tutorial on how to gain and maintain a competitive advantage from champion racecar driver Julia Landauer. Later, she will answer questions about habits and mindsets that drive success.

Priceline Chief Technology Officer Marty Brodbeck will share how the online travel agency sped up its cloud native software production. Meanwhile, Expedia Vice President of Development and Runtime Platform Robert Duffy will discuss how to become a results-driven cloud native organization with Cloud Native Computing Foundation Chief Technology Officer Chris Aniszczyk.

Looking to integrate cloud native apps into a seamless operational platform? Professional Case Management CIO Charlie Billings will share his organization's experience. In another session, Joseph Sieczkowski, CIO for architecture and engineering at BNY Mellon, will discuss cultivating an agile and dynamic operating model.

Finally, in a glimpse at what's to come, learn the hottest cloud-native software development trends from InfoWorld senior writer Serdar Yegulalp and Group Editor for UK B2B Scott Carey. In addition, Yegulalp will present a non-technical introduction to Kubernetes and best practices of managing container-based applications at scale.

Throughout the summit, sponsors including Cloudera, Freshworks and others will share innovative solutions for building your cloud strategy.

Check out the full summit agenda here. The event is free to attend for qualified attendees. Don't miss out; register today.

Pictured above (left to right): Ghada Ijam, System CIO, Federal Reserve System; Atilla Tinic, EVP, Chief Information Officer, DISH Network; racecar driver Julia Landauer.

Excerpt from:
The key is the cloud: How to keep up with the speed of innovation - CIO

Cloud or Mainframe? The Answer is Both – IBM Newsroom

Cloud or Mainframe? The Answer is Both

By John Granger | Senior Vice President of IBM Consulting

April 06, 2022

To respond to the ongoing pressures of the global pandemic, businesses around the world have turbo-charged their digital transformations. Everywhere you look, companies face an acute need for speed to market, flexibility, nimbleness and, of course, ongoing innovation.

These priorities are why companies are looking to take advantage of cloud computing. But it is not straightforward; it's not just the hop to public cloud. Clients have issues of security and data gravity, of complex systems that are expensive to migrate. Strategically, they have concerns about optionality, about lock-in, about discovering that their cloud providers have just become their competitors. These realities explain why so few clients have made a wholesale move to cloud.

The unique needs each company faces in its business transformation journey require a diverse mix of applications and environments including traditional data centers, edge computing and SaaS. What is the role of the mainframe in today's IT infrastructure?

According to a recent IBM study*, the vast majority (a whopping 71%) of IT executives surveyed from major corporations across seven industries say critical mainframe-based applications not only have a place in their IT platforms today but are central to their business strategy. And in three years, the percentage of organizations leveraging mainframe assets in a hybrid cloud environment is expected to increase by more than two-fold. Four of five executives say their organizations need to rapidly transform to keep up with competition, which includes modernizing mainframe-based apps and adopting a more open approach to cloud migration.

A hybrid cloud approach that includes and integrates mainframe computing can drive up to five times the value of a public cloud platform alone, and the main sources of value fall into five categories: increased business acceleration, developer productivity, infrastructure efficiency, risk and compliance management, and long-term flexibility. With the billions of dollars our clients have invested in business-critical mainframe applications like financial management, customer data and transaction processing over the years, this strategy holds true for IBM's global consulting practice. Our clients' primary goal is to modernize those existing investments and minimize risk while delivering hybrid cloud innovation.

Digital transformation is not an either-or process. We guide our clients on the application modernization journey with these key recommendations:

First, adopt an iterative approach. Many enterprises are experiencing firsthand the complexity of their IT estates. Continuing to add to the existing vertical cloud silos is undercutting their flexibility by making processes related to development, operations, and security even more fragmented than before and cloud fragmentation makes it virtually impossible to achieve the standardization and scale that cloud promises to deliver. Therefore, part of your plan to integrate new and existing environments must factor in your industry and workload attributes to co-create a business case and road map designed to meet your strategic goals. Adopt an incremental and adaptive approach to modernization as compared to a big bang. Leverage techniques such as coexistence architecture to gradually make the transition to the integrated hybrid architecture.

Then, assess your portfolio and build your roadmap. To understand your desired future state, first assess your current state. Examine the capabilities that define the role of the mainframe in your enterprise today and how those capabilities tie into the greater hybrid cloud technology ecosystem. In addition, take stock of your existing talent and resources and determine any potential gaps. For IBM's consulting business, the partnership and role that IBM Systems plays is fundamental for the simple reason that solutions such as the new IBM z16 perform many of the critical functions underpinning a truly open and secure hybrid cloud environment. These functions include accessing troves of unstructured on-premises data across a hybrid cloud platform, scaling and automating data-driven insights with AI, and being agile enough to process critical apps and data in real-time, all while assessing security risks. Storing data across multiple clouds and moving it between partners and third parties can leave companies more vulnerable to security issues such as data breaches. Assessing infrastructure solutions that support the ability to protect data even when it leaves your platform is crucial.

Finally, leverage multiple modernization strategies and enable easy access to existing mainframe applications and data by using APIs. This means providing a common developer experience by integrating open-source tools and a streamlined process for agility, in addition to developing cloud native applications on the mainframe and containerizing those applications.
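To make the API recommendation concrete, here is a minimal, hypothetical sketch of a cloud-native service reading data that is ultimately served by a mainframe transaction through a REST gateway; the gateway URL, resource names and token are illustrative assumptions, not a documented IBM interface.

```python
# Hypothetical sketch: a cloud-native service calling a REST API that fronts a
# mainframe transaction. The gateway URL, path and token are invented examples.
import requests

GATEWAY_URL = "https://api.example-bank.com/accounts/v1"  # hypothetical gateway

def get_account_balance(account_id: str, token: str) -> float:
    """Fetch a balance whose system of record is a mainframe application."""
    response = requests.get(
        f"{GATEWAY_URL}/{account_id}/balance",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["balance"]

if __name__ == "__main__":
    print(get_account_balance("12345678", token="example-token"))
```

In this pattern the mainframe remains the system of record while newer, containerized applications consume its data through a versioned API layer, which is the kind of incremental coexistence the recommendations describe.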

IT executives expect significant usage increases in both mainframe (35%) and cloud-based applications (44%) over the next two years.

Consider how you can extract more value from both your mainframe and cloud investments. Blending mainframe power, reliability and security into the cloud landscape is essential to achieve the enterprise-wide agility and capability required to keep pace with ever-changing business needs.

* Study Methodology: Oxford Economics and the IBM Institute for Business Value surveyed 200 IT executives (for example, CIOs, Chief Enterprise architects) in North America.

Read the original post:
Cloud or Mainframe? The Answer is Both - IBM Newsroom

Will cloud computing be Canada's next big military procurement? Here's what to know – Global News

Ask most Canadians what the military needs next, and cloud computing might not be the first thing that jumps to mind.

But modernizing how Canadian security officials manage increasingly massive troves of data could be among the most important decisions of the coming years and federal officials have confirmed to Global News that preliminary work is underway.

"Militaries are reflective of the societies they live in, and a lot of the sort of development of how we're going to fight wars in the future is stuff that we see in society today, which is large amounts of data management," said Richard Shimooka, a senior fellow at the Macdonald-Laurier Institute.

"It's taking huge amounts of information and organizing and storing it away, and then actually applying them to conduct operations."

Canadian national security agencies and the military sit atop troves of data that need to be continually tracked, assessed and managed in order to support the operations carried out to protect the country's interests.

Increasingly though, those reams of data aren't being stored just in filing cabinets or basements or bunkers. They sit in the cloud: the digital ether that most Canadians likely know best as the safe haven for backing up old family photos or for syncing information between multiple devices.

As the amorphous nature of cyber warfare and cyber conflict have demonstrated over recent years, being able to gather, interpret, share and act on digital information is already a critical part of how militaries and national security agencies do their jobs in the 21st century.

Yet modernization has been a slow march for Canadian security actors, including the Canadian Forces.

"Some of our systems and processes are dating back to the '50s. So [there is] crazy potential to upgrade that with not even modern practices, but to catch up to the 2010s," said Dave Perry, vice president of the Canadian Global Affairs Institute and an expert in Canadian defence policy.

"It was a massive accomplishment to start using [Microsoft] Office 365 in recent years."

U.S. military cloud contracts are worth billions

Speculation about whether Canada could look toward a cloud computing contract comes amid plans south of the border to award a multibillion-dollar contract later this year for the Department of Defense.

Last summer, the U.S. Defense Department announced plans to award a contract in April 2022 for what it now calls the Joint Warfighting Cloud Capability.

That initiative aims to bring multiple American IT providers into a contract to provide cloud computing services for the military, and it replaces a single-vendor program planned under the former Trump administration that was known as JEDI, the Joint Enterprise Defense Infrastructure project.

Last month, the Pentagon announced the JWCC contract won't be awarded until December 2022.

Microsoft and Amazon are believed to be frontrunners for different parts of that deal, while Google, Oracle and IBM have also expressed interest.

Some of those firms are now also lobbying Canadian officials to get similar contracts in place here.

Which firms are lobbying Canadian officials?

Google, IBM, Oracle and Microsoft did not have any lobbying listings with national security officials in recent months, although all list cloud computing among their broader lobbying interests with officials from other departments, including the Treasury Board Secretariat, Justice Canada and Natural Resources.

Amazon Web Services does have recent records filed disclosing lobbying with national security agencies and officials, one of its listed interests being seeking contracts with multiple government departments and institutions with regards to Amazon Cloud-based solutions and related support services.

The web giant also has job postings up for working on its push to get cloud computing into Canadian government departments, including an account manager. That role is tasked with increasing adoption of Amazon Web Services by developing strategic accounts within Canada's Federal Government National Security sector.

According to lobbyist filings, Eric Gales, president of the Canadian branch, had meetings with Michael Power, chief of staff to Defence Minister Anita Anand, on Feb. 19, 2022, and one day earlier had met with the acting assistant deputy minister of Shared Services Canada, Scott Davis.

He also met with Sami Khoury, head of the Canadian Centre for Cyber Security, on Nov. 17, 2021.

The Canadian Centre for Cyber Security is part of the Communications Security Establishment, Canada's signals intelligence agency and the body tasked with protecting the Government of Canada's IT networks.

A spokesperson for the CSE confirmed early work on the matter is underway.

"The evolving information technology (IT) world is moving to cloud-based services. We are aware that our closest allies have, or are acquiring, classified cloud capabilities, and we continue to engage in conversations with them on security requirements to maintain interoperability," Evan Koronewski said.

"The Government of Canada's security and intelligence community is engaged in preliminary research, exploring the requirements for classified cloud services."

He added officials are exploring security requirements with the Treasury Board Secretariat, Shared Services Canada, and the Department of National Defence.

A spokesperson for the latter also confirmed that the military is working on incorporating more cloud capabilities, though not yet for classified material.

"We recognize that cloud computing offers key benefits in terms of IT efficiency," said Dan Le Bouthillier.

"DND/CAF is building its cloud capacity and has adopted a Multi-cloud Strategy with multiple vendors, namely Microsoft, Amazon Web Services, and Google."

He added the goal is to strike the right balance between agility and security.

The website for Shared Services Canada, which handles IT services for government departments, states there are framework agreements for cloud computing in place with eight providers: Google Cloud, ServiceNow, IBM Cloud, Oracle, ThinkOn, Microsoft and Amazon Web Services.

Those will let departments contract cloud services as they need through those providers.

The U.S. military cloud computing contract is valued at US$9 billion, or about $11.2 billion in Canadian dollars.

It's not clear how much a similar solution for national security agencies here could cost.

Both Prime Minister Justin Trudeau and Defence Minister Anita Anand have suggested in recent weeks that the government is weighing an increase to defence spending, moving it closer to the NATO target, which aims to see all members of the military alliance spend at least two per cent of GDP on defence.

Canada's current defence spending sits at 1.39 per cent of GDP.

To hit the two per cent target would require approximately $16 billion.

That would be above the increases currently projected under the government's 2017 plan to boost defence spending, which will see it rise to $32.7 billion by 2026/27 from $18.9 billion in 2016/17.

Originally posted here:
Will cloud computing be Canadas next big military procurement? Heres what to know - Global News

What Would Happen to Cloud Computing Careers in Next 5 Years? Will They Perish? – Analytics Insight

Let us see whether cloud computing careers have a chance or will die in the next five years.

Cloud computing is undoubtedly the hottest field for any technology worker right now. The cloud has become a popular buzzword among both large and small businesses. The adoption of the cloud by the rest of the globe has resulted in a tremendous increase in cloud-related jobs.

Cloud computing allows us to connect almost anything digitally nowadays. It opens up a whole new universe of opportunities in terms of jobs, applications, services and platforms. Cloud computing's future can be seen as a mix of cloud-based software and on-premises computing, which will aid in the creation of hybrid IT solutions.

Because there is so much competition among cloud providers, more data centres will be available at a reduced cost. With the combination of the Internet of Things (IoT) and cloud computing, we can save data in the cloud for further analysis and improved performance. Networks will be faster, and data will be received and delivered more quickly.
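As a rough illustration of the IoT-plus-cloud pattern described above, the following sketch shows a device pushing a sensor reading to a cloud ingestion endpoint for later analysis; the endpoint URL and payload fields are assumptions made for the example, not any specific provider's API.

```python
# Hypothetical sketch of an IoT device sending a reading to a cloud endpoint
# for storage and later analysis. The URL and payload shape are invented.
import time

import requests

CLOUD_ENDPOINT = "https://ingest.example-cloud.com/v1/readings"  # hypothetical

def send_reading(device_id: str, temperature_c: float) -> int:
    """POST one sensor reading and return the HTTP status code."""
    payload = {
        "device_id": device_id,
        "temperature_c": temperature_c,
        "timestamp": time.time(),
    }
    response = requests.post(CLOUD_ENDPOINT, json=payload, timeout=10)
    response.raise_for_status()
    return response.status_code

if __name__ == "__main__":
    print(send_reading("sensor-42", 21.7))
```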

Organizations can reach their respective objectives with the help of these services. Many studies have shown that cloud computing will be one of the most important technologies of the future, with software-as-a-service solutions accounting for more than 60% of all workloads.

The cloud has a number of advantages that make its future in the IT industry more promising. Modern cloud platforms are scalable and versatile while still allowing for data center security and control. Organized procedures and better techniques for processing data will be important aspects of cloud computing.

With a staggering 1828 percent rise in employment in the last year, cloud computing and management services are the fastest growing employer in the software business. Second place went to IT Services and Consulting, which saw a relatively tiny 355 percent rise in personnel.

When you realize that these data include only five firms whose core business is cloud computing, and not cloud positions within other corporations, the true number of cloud jobs is substantially larger.

Software Engineer, Java Developer, Systems Engineer, Network Engineer, Systems Administrator, and Enterprise Architect are some of the most common cloud occupations. However, you don't have to be an engineer to be a part of this rapidly growing job market; other roles requiring cloud knowledge, such as marketing managers or market and research analysts, are in high demand.

The most common prerequisite for technical cloud employment is cloud computing itself. Other common requirements include Oracle Java, Linux, Structured Query Language (SQL), UNIX, Software as a Service (SaaS), the Python programming language, and the like.

With concerns about cloud security dissipating, an increasing number of businesses are preparing to migrate to the cloud. According to one report, the number of jobs created by cloud computing was projected to rise from 6.7 million to 13.8 million by 2015, with job growth of 108 percent in the United Kingdom alone.

There has never been a better moment to be a cloud expert, with more cloud positions open than cloud technicians to fill them. It is getting increasingly difficult for firms to find the talent they require. To ensure that you obtain the best personnel, you must know where to look and how to time your job postings effectively.

There is certainly a skills gap in this field, because organizations expect more advanced cloud computing skills than many young cloud professionals and technicians currently have. But it can be safely said that within the next five years, cloud computing careers will only flourish, and certainly not die.

Cloud computing is strong and expansive, and it will continue to grow and deliver many benefits in the future because it is highly cost-effective and businesses can use it to scale. Cloud computing careers therefore have a bright future ahead, with benefits for both the provider and the customer, although it's important for business owners to stay up to date on the latest developments in cloud technology.

Follow this link:
What Would Happen to Cloud Computing Careers in Next 5 Years? Wil They Perish? - Analytics Insight

The future of work in the heterogeneous diverse cloud – BCS

Today, we have seen the proliferation, popularisation and eventual propagation of cloud computing, mobile device ubiquity and new algorithmically-enriched approaches to Artificial Intelligence (AI) and Machine Learning (ML)... all of which have further changed the nature of work.

The sum consequence of much of the development on the post-millennial technology curve is a new approach to digitally-driven work. To explain this shuddering generalisation, digital work means tasks, processes, procedures and higher-level workflows that can be encoded into data in order for their status to be tracked, analysed and managed.

Part of the total estate of big data that now hovers over all digital assets in the modern workplace, digital workflows can now be built that are more intelligently shared between humans and machines.

Where processes are accurately definable, typically repeatable and easily replicable, we now have the opportunity to use autonomous software controls such as Robotic Process Automation (RPA) and chatbots to shoulder part of our daily tasks. Although there is a period of process mining and process discovery that we need to perform before we can switch on the autonomous advantage, once we do so we can start to focus human skills on more creative higher-value tasks.
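As a toy illustration of the process discovery mentioned above, the sketch below counts which activity directly follows which in a small, invented event log; the log format and activity names are assumptions made for the example.

```python
# Toy process-discovery sketch: count directly-follows activity pairs in an
# event log. The log below is invented purely for illustration.
from collections import Counter

event_log = {
    "case-1": ["receive_invoice", "validate", "approve", "pay"],
    "case-2": ["receive_invoice", "validate", "reject"],
    "case-3": ["receive_invoice", "validate", "approve", "pay"],
}

directly_follows = Counter(
    (a, b)
    for trace in event_log.values()
    for a, b in zip(trace, trace[1:])
)

for (a, b), count in directly_follows.most_common():
    print(f"{a} -> {b}: {count}")
```

The most frequent, repeatable paths surfaced by this kind of count are the natural candidates to hand to RPA or a chatbot, freeing people for the higher-value creative work.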

Where all of this gets us is to a point where we can be intelligently granular about how we place elements of our total digital workload across data services, across application resources, across cloud backbones and ultimately, across people.

To enable digital work, we still have some challenges to overcome, i.e. we need to be able to communicate with each other, as humans and machines, in a consistent yet essentially decoupled way. Because not every work task has had its genome decrypted, we are still searching for ways to encapsulate certain aspects of enterprise workflows.

This is tough because we're aiming at a moving target, i.e. market swings and the dynamism of global trade. But, as we start to build new work systems, we can start to operate workflows that are intelligently shared across different interconnected cloud services, for a variety of core reasons.

Enterprises can now create a layered fabric of work elements and functions shared across different Cloud Services Providers (CSPs), sometimes separated-out on the basis of different cloud contract costs, sometimes for reasons related to geographic latency or regulatory compliance, or often dispersed across more than one cloud due to the various optimisation functions (processing, storage, transactional Input/Output capability, GPU accelerated etc.) that exist in different services.
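A minimal sketch of what such a placement decision could look like in code follows; the provider names, prices, latencies and weighting are invented for illustration, and a real system would draw them from contracts, monitoring and compliance rules.

```python
# Illustrative poly-cloud placement: pick the cheapest eligible provider for a
# workload, given invented cost, latency and compliance attributes.
from dataclasses import dataclass

@dataclass
class CloudOption:
    name: str
    cost_per_hour: float   # contract cost
    latency_ms: float      # latency to the workload's users
    in_region: bool        # satisfies data-residency / regulatory rules
    has_gpu: bool          # offers the accelerator the workload needs

def place_workload(options, needs_gpu=False, must_be_in_region=True):
    """Return the best-scoring option that meets the workload's constraints."""
    eligible = [
        o for o in options
        if (o.in_region or not must_be_in_region) and (o.has_gpu or not needs_gpu)
    ]
    # Cost-plus-latency score; the 0.01 weight is an arbitrary example value.
    return min(eligible, key=lambda o: o.cost_per_hour + 0.01 * o.latency_ms)

providers = [
    CloudOption("cloud-a", 1.20, 35, in_region=True, has_gpu=True),
    CloudOption("cloud-b", 0.90, 80, in_region=True, has_gpu=False),
    CloudOption("cloud-c", 0.70, 20, in_region=False, has_gpu=True),
]
print(place_workload(providers, needs_gpu=True).name)  # prints "cloud-a"
```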

If private on-premises cloud combined with public cloud is what we now understand to be the de facto most sensible approach we know as hybrid cloud, then this (above) deployment scenario is one move wider. Where workloads are placed across clouds, we are in hybrid territory; but where individual data workflows are dispersed across and between different cloud services, we get to poly-cloud.

The architectural complexity of interconnected cloud services that are established around these terms is not hard to grasp. In order to make this type of lower substrate diversity manageable, cost-effective and above all functional, enterprises will need to embrace a platform-based approach to hyperconverged cloud infrastructure.

Most organisations struggle to effectively manage heterogeneous cloud environments and move workloads back and forth between and among them. Establishing visible benefits from this type of approach to cloud is only possible if the business is able to think of its cloud infrastructure as an invisible foundational layer.

Managing a multi-cloud and poly-cloud infrastructure means being able to simplify cloud management and operations requirements across an enterprise's chosen estate of interconnected cloud services. With different providers all offering different software toolsets, different management dashboards, different configuration parameters and so on, there is no point-and-click solution without a hyperconverged higher platform layer in place.

As theoretical as some of the discussion here sounds, many practical examples already exist. South Africa's largest bank, Nedbank, has been bold with a cloud-based approach designed to deliver cost-effectively on its diverse bandwidth requirements.

Needing low-latency remote worker provision for its 2,000-strong developer function in India (but capable of straddling less performant latency parameters for other functions), the company had to build systems capable of superior service that would be a win-win for staff and customers alike.

Read this article:
The future of work in the heterogeneous diverse cloud - BCS

BrainChip, SiFive partner to bring AI and ML to edge computing – VentureBeat

AI processor maker BrainChip, which makes ultra-low-power neuromorphic chips and supporting software, and SiFive, founder of the RISC-V computing genre, today announced they have combined their respective technologies to offer chip designers optimized artificial intelligence (AI) and machine learning (ML) for edge computing.

BrainChip's AI engine, Akida, is an advanced neural networking processor architecture that brings AI functionality to edge and cloud computing in a way that wasn't previously possible, thanks to its high performance and ultra-low power usage, the company said. SiFive Intelligence solutions, with their highly configurable multi-core, multi-cluster-capable design, integrate software and hardware to accelerate AI/ML applications, BrainChip CMO Jerome Nadel told VentureBeat.

The integration of BrainChips Akida technology and SiFives multi-core capable RISC-V processors is expected to provide an efficient solution for integrated edge AI computing, Nadel said.

RISC-V (pronounced "risk-five") is an open instruction-set computing architecture based on established reduced instruction set computing (RISC) principles. It's an open-source project available to anybody who wants to use it. RISC-V represents a major step forward in data processing speed that is required of all the new and much heavier applications (such as machine learning, AI and high-resolution video) that are coming into daily use. RISC-V appears to be a natural fit for BrainChip's architecture for neural networking processors.

SiFive Intelligence-based processors have a highly configurable multi-core, multi-cluster-capable design that has been optimized for a range of applications requiring high-throughput, single-thread performance while under tight power and area constraints, Nadel said.

BrainChip's Akida mimics the human brain to analyze only essential sensor inputs at the point of acquisition, processing data with efficiency, precision, and economy of energy, Nadel said. Keeping AI/ML local to the chip and independent of the cloud reduces latency while improving privacy and data security, he said.

BrainChip's technology is based on its spiking neuron adaptive processor (SNAP), which the company licenses to technology partners. SNAP offers a development solution for companies entering the neuromorphic semiconductor chip market. It is a core-enabling technology in neuromorphic semiconductor chips that enables various applications, such as gaming, cybersecurity, robotic technology and stock market forecasting, among others.

"As we expand our ecosystem of portfolio partners, we want to be sure that our relationships are built on complementary technologies, enabling capabilities and breadth of environments so that we can expand opportunities to as many potential customers as possible," Nadel said. "Driving our technology into a SiFive-based subsystem is exactly the type of partnership that meets these goals."

VentureBeat asked Jack Kang, senior vice president of Business Development, Customer Experience (CX), Corporate Marketing at SiFive, a few specific questions about the news and the relevance of the partnership.

VentureBeat: What is the no. 1 business takeaway from this announcement?

Jack Kang: For SiFive, this announcement shows the ongoing uptake of the SiFive Intelligence family of RISC-V-based processor IP. More companies are choosing RISC-V to be part of their product roadmap strategy, and SiFive is the leading provider of commercial RISC-V IP. In the emerging greenfield markets of AI/ML-enabled platforms, such as the edge processing market targeted by BrainChip, the performance per area and efficiency advantages of SiFive processor architecture make the SiFive Intelligence family a competitive choice.

VentureBeat: Does BrainChip use any of Arm's IP in its chips? Arm is known for low power and high performance.

Kang: BrainChip has discussed Arm IP for their product line. Arm processors have built a reputation for low power based on comparisons to x86-based products. SiFive Intelligence products compare well to Arm products through offering improved performance-per-area of up to 30%, combined with a single ISA for simpler programming, and a modular approach that aligns well to working with hardened AI IP such as that developed by BrainChip.

VentureBeat: Can you expand upon this statement: "(BrainChip) mimics the human brain to analyze only essential sensor inputs at the point of acquisition."

Kang: This statement refers to the ability of humans to focus on what's important. For example, listening to a conversation in a coffee shop while still registering and acknowledging background sounds. The BrainChip solution will mimic this ability to reduce power and increase efficiency by focusing on the important data being processed. This is similar to, but a step beyond, the adoption of mixed and lower precision data types (INT8 vs. FP16) to speed up and improve the efficiency of AI/ML processing.
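For readers unfamiliar with the precision reduction Kang refers to, here is a small sketch of symmetric INT8 quantization of FP32 values; it illustrates the general idea only and is not BrainChip's or SiFive's implementation.

```python
# Illustrative symmetric INT8 quantization of FP32 values, the kind of
# precision reduction referenced above; purely a generic example.
import numpy as np

def quantize_int8(x: np.ndarray):
    """Map float values to int8 using a single scale factor."""
    max_abs = float(np.max(np.abs(x)))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

activations = np.random.randn(8).astype(np.float32)
q, scale = quantize_int8(activations)
print("max abs error:", np.max(np.abs(activations - dequantize(q, scale))))
```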

BrainChip, based in Aliso Viejo, California, competes in the burgeoning intelligent-edge chip market with Nvidia Deep Learning GPU, Keras, TFLearn, Clarifai, Microsoft Cognitive Toolkit, AWS Deep Learning AMIs and Torch. Nvidia owns about 80 percent of the global GPU (graphics processing unit) market. G2.com has market information here. Availability of the new SiFive/BrainChip solutions will be announced at a later date.

Read the rest here:
BrainChip, SiFive partner to bring AI and ML to edge computing - VentureBeat

UCL and AWS partner to launch digital innovation centre – Healthcare IT News

University College London (UCL) and Amazon Web Services (AWS) are joining forces to launch a centre for digital innovation.

The centre, to be hosted at the IDEALondon technology hub, will help healthcare and education organisations to accelerate digital innovation and address global issues in the sectors.

It has sent out its first call for engagement via the Impact Accelerator, a programme offered by UCL and AWS which aims to boost startups by providing advice, education and funding initiatives.

Successful applicants to the UCL Centre for Digital Innovation (CDI) will be supported to build a product prototype, or helped to design it for scalability if one already exists. AWS will also provide AWS credits of up to $500,000 (£370,000) per year to help fund development of prototypes and new solutions.

Healthcare and education organisations, research teams, startups and UCL's technology spinouts are eligible to apply.

WHY IT MATTERS

Projects should look to solve a global issue in health or education using cloud computing and have a real user and customer in mind. The aim is to produce evidence-based, commercially sustainable technological innovations.

UCL will draw on expertise from several faculties including medical sciences, IOE, engineering and life sciences, and experts from University College London Hospital (UCLH).

Successful applicants will be given access to training, education and technical support, including a resident AWS solutions architect, domain experts in health tech and ed tech, and immersion days on specialist topics.

THE LARGER CONTEXT

AWS launched its first accelerator programme for UK-based digital healthcare startups in 2021. It recently announced 12 innovators who have been selected to take part in the programme.

Last year AWS announced plans to open an infrastructure region in the United Arab Emirates (UAE) in the first half of 2022. The expansion is part of AWS's aims to build on its current 80 availability zones globally across 25 geographic regions.

ON THE RECORD

UCL CDI director Graça Carvalho (UCL Faculty of Engineering Sciences) said: "This ambitious collaboration brings together the strength of UCL and AWS to build trust within digital innovation, allowing hospitals, universities, patients, students, research teams and UCL spinouts to use cloud-based technology to compete on a global stage."

John Davies, director, regional government, UK, worldwide public sector at AWS, said: "By bringing together UCL's world-renowned academic rigour with AWS cloud technologies and culture of innovation, we hope to provide healthcare and education organisations with a springboard to help them to address some of the toughest challenges facing society right now."

UCL pro-vice-provost (AI), Professor Geraint Rees, said: "Innovative digital solutions to the world's problems are best created in collaboration between academic and commercial organisations. The UCL CDI, powered by AWS, combines the best of both domains. We believe that this combined endeavour will lead us to solutions that are evidence-based, commercially sustainable and focus on the needs of the world's citizens."

Originally posted here:
UCL and AWS partner to launch digital innovation centre - Healthcare IT News
