Category Archives: Cloud Servers

Zoom 5.0 update released with several security improvements amidst public backlash – Digit

Zoom has released a 5.0 update to its popular video conferencing service with many security and privacy improvements that give the company some breathing space amidst public backlash. The Indian government even issued an advisory stating that Zoom isn't a safe platform for official communication, and it has restricted its employees from using the app for official work.

Zoom's latest update targets this outrage with reinforced security features like support for AES 256-bit encryption, the ability to report a user, and more. In a statement to the press, Oded Gal, CPO of Zoom, said, "From our network to our feature set to our user experience, everything is being put through rigorous scrutiny. On the back end, AES 256-bit GCM encryption will raise the bar for securing our users' data in transit. On the front end, I'm most excited about the Security icon in the meeting menu bar. This takes our security features, existing and new, and puts them front and centre for our meeting hosts. With millions of new users, this will make sure they have instant access to important security controls in their meetings."

Zoom's latest update brings more robust network support with new control options and a focus on improving the user experience. The update adds support for the AES 256-bit GCM encryption standard, which strengthens the security of data transmitted during calls and protects it from unauthorized access.

This new encryption technology is being rolled out to Zoom Meetings, Webinars and Phone, and will be activated on May 30, once all accounts have received the 5.0 update. Moreover, administrators can now control which data centres are used to route traffic to their Zoom meetings.

The new update also adds a dedicated Security button to the video calling interface that groups together all the essential security options. Zoom meeting hosts will also be able to report a user from the Security settings and restrict participants from changing their display names.

Zoom users with a Basic, Education or Pro account will now notice that the Waiting Room is enabled by default, holding participants until the host admits them to the meeting. The update also activates meeting passwords by default, and hosts can even modify the password complexity.

Call recordings stored on cloud servers can be accessed by hosts without a password, but everyone else will require a password to access those recordings.

Zoom 5.0 also brings support for organizations to link contacts across multiple accounts, so people can easily search for meetings and chats.

Read the original here:
Zoom 5.0 update released with several security improvements amidst public backlash - Digit

Intel: The Growth Story Continues – Seeking Alpha

Source: Intel

Intel Corporation (INTC) is the world's largest manufacturer of semiconductor products. Despite its size, it has achieved consistent revenue growth, with a five-year average growth rate of 5.57%, and accordingly its stock price nearly doubled over that same period. However, the slowdown of the PC market poses a risk to the company's continued growth, as most of its revenue is derived from its CPUs. Fortunately, the company foresaw this and expanded into the data centre space, which is its current focus. Looking further ahead, Intel has already laid the groundwork in several high-growth segments to secure its future growth.

While many companies seek to diversify their revenue for additional growth, few have the scale and resources that Intel does. Additionally, we found reasonable indication that the company could rapidly scale in these newer businesses. We had to look no further than Intel itself and its rapid expansion into its second-largest segment, the Data Center Group (DCG). The three main success factors we identified were: 1) leveraging existing customer relationships, 2) securing newer customers through its portfolio of products, and 3) riding the high growth of the markets.

Intel used these strengths to its advantage when diversifying into DCG, and we expect it to take full advantage of them again as it expands into its newer segments.

| Revenue Segment | $ mln | % Revenue | Growth Driver |
| --- | --- | --- | --- |
| Client Computing Group (CCG) | 37,146 | 51.39% | Past |
| Data Center Group (DCG) | 23,481 | 32.49% | Present |
| Non-Volatile Memory Solutions Group (NSG) | 4,362 | 6.04% | Present/Future |
| Intel Security Group (ISG) | 313 | 0.43% | Present/Future |
| Internet of Things Group (IoTG) | 3,821 | 5.29% | Future |
| Programmable Solutions Group (PSG) | 1,987 | 2.75% | Future |
| Automated Driving Group (ADG) | 879 | 1.22% | Future |
| All Other | 289 | 0.40% | N/A |

Source: Intel

Intel's business segments can be viewed as representing three different timelines of growth. In the table above, we label the time period in which each segment serves as the main growth driver for Intel. Before getting into Intel's newer high-growth segments, we explain why Intel's Client Computing Group (CCG) has seen its best days pass.

Intel's largest segment, the Client Computing Group (CCG), represents Intel's rich history in CPUs, stretching back to 1968. As the inventor of the x86 series of CPUs, it holds the world's largest share of the CPU market. This segment undoubtedly made Intel into the giant it is today. The PC-centric CCG accounts for revenue mostly from notebooks and desktops. As mentioned, personal computer unit sales have been in decline, which does not bode well for Intel. Its three largest customers, Dell Technologies Inc. (DELL) (16% of revenue), Lenovo (OTCPK:LNVGY) (12%) and Hewlett Packard (HPE) (11%), also happen to be the three largest PC makers in the world by market share, as seen below.

Source: Statista

With PC unit sales decreasing by an average of 4.09% annually since 2012, we expect this trend to continue as PC lifespans lengthen, with current lifespans reaching 5-8 years. This is reinforced by PC vendors becoming increasingly competitive in after-sales services such as maintenance, repair and upgrades. According to data from Statista, Intel's unit sales have been declining since the end of 2016, with Intel losing unit sales not only to the shrinking overall PC market but also to loss of market share to Advanced Micro Devices (AMD).

Over the last few years, AMD has staged a strong recovery in the PC market with the launch of its Ryzen and Epyc CPUs. AMD's CPUs not only offer better multi-core performance but also sell for much lower prices:

| Intel CPU Average Selling Price | AMD CPU Average Selling Price |
| --- | --- |
| $735.65 | $322.20 |

Source: CPU World

Besides AMD, ARM-based processors from Arm Holdings (owned by SoftBank (OTCPK:SFTBY)) have proven to be worthy substitutes, with some of Intel's top customers, including Lenovo and HP, having released ARM-based laptops. Surprisingly, Intel has managed to maintain slight revenue growth by increasing its average selling prices. However, as Intel's CPU prices already far exceed AMD's, we do not see a sustainable way for Intel to keep growing revenue in this segment.

In the last earnings call, Intel CFO George Davis guided for low single-digit growth in the PC-centric segment. However, we feel this is overly optimistic. While the COVID-19 pandemic has created temporary tailwinds for PCs as people are mandated to work from home, this should normalize towards the end of the year. Additionally, consumers become much more price-sensitive in recessions, and AMD's more affordable processors provide much better alternatives. Considering this, even after accounting for an increase in average selling prices for Intel's CPUs and a temporary increase in unit sales for this quarter, our calculations show revenue for CCG decreasing in the long term.

Around the same time PC sales started to decline, Intel began rapidly scaling in the data centre space, providing chips for the computing power requirements of the data centers run by its core PC customers and other cloud players. This brings us to the present.

While Intel has exposure to millions of end consumers through its PC customers, Intel itself has only a few direct customers. The nature of the semiconductor industry means chipmakers like Intel control most of the chip supply to large OEMs (smartphone manufacturers, PC manufacturers, automakers, data centre companies and e-commerce platforms). This explains how just three customers (PC manufacturers) make up 39% of Intel's $71.9 billion revenue.

This also means Intel can very quickly gain access to large buyers. Intel did just that, leveraging its existing CCG customers such as Lenovo and Dell to rapidly grow in the DCG space over the past five years, as seen below.

Source: Intel

Intel continued to build on these existing customer relationships by offering newer products from its portfolio. For example, Intel recently announced a multiyear agreement with Lenovo to provide its server chips for high-performance computing and AI workloads. By building out its product portfolio, it gained newer customers in the cloud space such as Amazon (AMZN), Microsoft (MSFT), Alibaba (BABA) and Google (GOOG) (NASDAQ:GOOGL). Microsoft uses Intel's Xeon Scalable processors in its Azure cloud servers, while Chinese e-commerce giant Alibaba deploys Intel's processor and memory technology for its e-commerce website. Building on this, Intel is collaborating with Alibaba for the 2021 Olympics in Japan, where Intel will power 3D Athlete Tracking technology.

While Intel serves large players in this space, it does face competition from AMD, which has secured some heavy hitters of its own, with orders from major players such as Dell, IBM (IBM) and Nokia (NOK). As such, while this segment remains Intel's present focus, the company has taken steps to establish newer business segments through product expansion and acquisitions to secure its future growth.

We can see Intel doing well in its future segments, as it is already replicating the key drivers that made the DCG expansion successful: 1) leveraging existing customer relationships, 2) securing newer customers through its portfolio of products, and 3) riding the high growth of the markets.

Intel has built a broad portfolio of products outside of CPUs, including GPUs, FPGAs and ASICs. This gives Intel the opportunity to cross-sell its products to both existing and newer customers. Intel has already proved this by securing contracts with existing cloud service customers, including Google, Oracle (ORCL), Cisco (CSCO), Dell and Lenovo, for its Non-Volatile Memory Solutions Group (NSG) segment. Dell, already an Intel CCG and DCG customer, uses Intel's 3D NAND and Optane SSDs alongside its Xeon server processors, making it a customer of yet another business segment. Intel has proven it can gain share in new markets relatively easily, despite the memory market being highly competitive, with large players such as Samsung (OTC:SSNLF), Micron (MU) and SK Hynix (OTC:HXSCF).

Intel has also grown its product portfolio inorganically through acquisitions. It acquired McAfee in 2011, which now forms its Intel Security Group (ISG) segment; Altera, a specialist in field-programmable gate arrays (FPGAs), which operates under its Programmable Solutions Group (PSG) segment; and Mobileye, a developer of advanced driver-assistance systems for autonomous vehicles, which operates under its high-growth Automated Driving Group (ADG) segment. The main competitors to its ISG segment are Symantec (acquired by Broadcom (AVGO)), ESET, Bitdefender, AVAST Software and Kaspersky Lab. Despite the competition, Intel had the advantage of a wide base of large enterprise customers to which it could cross-sell McAfee's security solutions. As PCs and anti-virus software go hand in hand, it was a no-brainer for Dell and HP to sell their PCs and laptops with McAfee software pre-installed. And of course, Intel has taken the opportunity to extend McAfee's cybersecurity solutions to its DCG customers, Google and Microsoft.

Building on this, Intel managed to get Microsoft to use its Stratix FPGAs in Microsoft's Azure cloud platform, configured to run deep learning models. However, Intel has failed to rope all of its customers into FPGAs, as it faces strong competition from Xilinx, the FPGA market leader with about 65% market share. Xilinx enjoys a technological lead over Intel, as its FPGAs are based on a 7nm manufacturing process while Altera's are based on 10nm and 14nm processes. Thanks to this lead, Xilinx has also secured cloud vendors Alibaba and Amazon (Intel's DCG customers) as FPGA customers.

While there is competition in the autonomous driving space from Nvidia (NVDA) and NXP (NXPI), ADG has been Intel's fastest-growing segment, up 26% Y/Y in 2019. Underlying this strong growth are its partnerships with automakers around the world. Intel recently announced a partnership with Chinese auto manufacturer SAIC. The addition of SAIC to its existing partnership with Nio will strengthen Intel's foothold in the Chinese auto market.

The Internet of Things Group (IoTG) exemplifies Intel's broad range of capabilities. For instance, Intel recently announced a partnership with The Sinclair, an Autograph Collection hotel owned by Marriott International (MAR), to build the world's first digital hotel by providing in-room sensors, Wi-Fi cloud networking solutions and PoE-powered LED mirrors. There are more than 140 Autograph Collection properties globally. Should the partnership extend across the collection, we estimate revenue of at least $45.9 million, based on 164 rooms per hotel and chip revenue of $2,000 per room for Intel. While this is not significant, opportunities such as these were not previously thought possible. With the world becoming increasingly interconnected, the Internet of Things (IoT) has opened up a realm of possibilities, and we see companies like Intel, which have the capability to provide end-to-end solutions, being able to capitalize on these opportunities.
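The arithmetic behind that estimate is simple to check:

```python
# Reproducing the back-of-envelope estimate for the Marriott partnership.
properties = 140          # Autograph Collection hotels worldwide
rooms_per_hotel = 164     # rooms per property (the article's assumption)
revenue_per_room = 2_000  # estimated Intel chip revenue per room, in USD

total = properties * rooms_per_hotel * revenue_per_room
print(f"${total:,}")      # $45,920,000, i.e. at least ~$45.9 million
```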

A potential opportunity we see for Intel is continued support for Microsoft's Azure platform with its new technologies in AI, machine learning and edge computing workloads. Intel may also have already secured a huge partner for its new GPUs, as it has hinted that it may supply its upcoming Xe discrete GPU accelerators to Lenovo. Additionally, should Intel jump up a manufacturing node for FPGAs, it would not be surprising to see Intel win over some of Xilinx's customers (some of which have longer-established relationships with Intel). As exemplified, Intel has the capability to rapidly expand its newer segments through partnerships with both existing and new customers.

Lastly, while these newer segments currently make up only 16.12% of revenue, they operate in very high-growth markets, as seen below.

[Table omitted: projected growth rates of the markets for Intel's newer segments. Footnotes: *4-year CAGR, **3-year CAGR, ***1-year CAGR.]

Our model projects revenue growth based on overall market growth and industry competitiveness. It is also seasonally adjusted and takes into account the negative impact of the coronavirus in the first half of FY2020. The projections further reflect the revenue opportunities that exist in each segment based on emerging technologies and the overlap of Intel's customers in said markets.

Source: Intel, Khaveen Investments

| Year | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 | 2020E* | 2021E* |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Revenue ($ mln) | 56,574 | 55,329 | 59,054 | 63,572 | 71,993 | 73,188 | 83,718 | 83,023 |
| Y/Y % | N/A | -2.2% | 6.7% | 7.7% | 13.2% | 1.7% | 14.4% | -0.8% |
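The Y/Y row follows directly from the revenue row; a quick sketch of the calculation (the 2021E figure, absent from the source, is computed the same way):

```python
# Recompute the Y/Y growth row from the revenue row above ($ millions).
revenue = {2014: 56_574, 2015: 55_329, 2016: 59_054, 2017: 63_572,
           2018: 71_993, 2019: 73_188, 2020: 83_718, 2021: 83_023}

years = sorted(revenue)
for prev, curr in zip(years, years[1:]):
    growth = (revenue[curr] / revenue[prev] - 1) * 100
    print(f"{curr}: {growth:+.1f}%")   # -2.2%, +6.7%, ..., +14.4%, -0.8%
```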

See the original post:
Intel: The Growth Story Continues - Seeking Alpha

Integrating AI & ML in cloud services for healthcare: The benefits and risks – ETCIO.com

By Shreekanth Joshi

Artificial intelligence (AI) and machine learning (ML) have seen exponential growth in their ability to consume large amounts of data and produce insights that approach human-level accuracy. This has largely been possible due to the availability of cloud-based resources that are scalable, more cost-effective and readily available.

In the healthcare industry, analytics of health-related data is improving care, from super-specialized tertiary care centers down to secondary and primary care. Telemedicine is making these insights available at the point of care, leading to better and more specialized diagnoses. Both developments make more reliable care available in real time across the last mile and help bridge the gap between large numbers of patients and a limited number of healthcare providers.

Access to care is a major issue for geographically distributed populations. But digital tools and resources can be provisioned in the cloud and made available over the last mile to areas with data network coverage. This allows primary centers to diagnose patients, collect digital samples and send them for analysis to tertiary centers.

Machine learning models can be made more robust and accurate using cloud infrastructure. The flexible resourcing available in the cloud can track more last-mile data from devices, wearables and health trackers, then stream and aggregate it cheaply in cloud-based storage. The heavy-duty analysis of this large amount of data can be done efficiently on cloud-based compute infrastructure. This in turn allows the ML models to be trained more effectively, and their accuracy improves over time.

The large amount of data available for training makes ML models scale even better. For several tasks in image analysis, for example, the model accuracy is already reaching human level. ML models can be made more personalized to start generating recommendations that are very specific to individual patients.

Regulatory Considerations

All this computation comes at a regulatory cost. Data must be secured at rest and in motion, and anonymized before being fed into the ML models, and recommendations must be re-identified to make them specific to a particular patient. This can involve resources from not one but multiple cloud providers working in a hybrid manner.
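One common way to satisfy the anonymize-then-re-identify requirement is keyed pseudonymization: identifiers are replaced with opaque tokens before model input, and a protected mapping restores them afterwards. The sketch below is a minimal illustration under assumed names, not any particular vendor's pipeline:

```python
# Minimal sketch of pseudonymization around an ML model (hypothetical helpers).
# Real deployments add key management, access control and audit logging.
import hashlib
import hmac

SECRET_KEY = b"fetch-from-a-kms-in-practice"  # assumption: key lives in a KMS
token_to_patient = {}                         # protected re-identification map

def anonymize(patient_id):
    """Replace a patient ID with an opaque token before model input."""
    token = hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()
    token_to_patient[token] = patient_id
    return token

def reidentify(token):
    """Map a model output back to the specific patient it concerns."""
    return token_to_patient[token]

token = anonymize("patient-12345")   # the ML model only ever sees this token
print(reidentify(token))             # restored only inside the trusted boundary
```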

The National Digital Health Blueprint calls for strict adherence to regulations with respect to privacy and protection of patient data. As a result, advanced technical checkpoints need to be implemented to prevent patient data from being accidentally disclosed to unintended recipients.

It's also necessary to enforce consent-related policies that allow patient data to be used only by healthcare professionals who have consent, and only for a specified duration.

This requires a heavy emphasis on securing cloud environments and enforcing controls for data access, processing and the dissemination of insights.
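A time-bounded check for the consent policies described above can be as simple as the toy sketch below (the schema and names are our own assumptions):

```python
# Toy consent gate: data is usable only by a consented professional,
# and only until the consent's expiry date. Hypothetical schema.
from datetime import datetime, timezone

consents = {
    ("patient-12345", "dr-mehta"): datetime(2030, 1, 1, tzinfo=timezone.utc),
}

def may_access(patient_id, professional_id):
    expiry = consents.get((patient_id, professional_id))
    return expiry is not None and datetime.now(timezone.utc) < expiry

print(may_access("patient-12345", "dr-mehta"))   # True until expiry
print(may_access("patient-12345", "dr-other"))   # False: no consent on file
```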

ML models also ingest significant amounts of personal data from patient devices, such as the health tracker on a mobile phone or wearables like Fitbits, sleep monitors, insulin monitors and even blood pressure monitors. All these devices must be integrated with the cloud resources to enable end-to-end data processing.

Given stringent regulation and privacy requirements, data privacy and access must be monitored and governed. This calls for the use of technologies for data and cloud management. IT teams must have comprehensive management frameworks that can integrate personal and corporate devices and implement the necessary safeguards.

Modern unified management frameworks provide a way to make this problem more manageable and can implement the necessary governance best practices to manage the consumer devices, mobile apps and back-end cloud platforms.

The author is Vice President, Engineering, Persistent Systems

Visit link:
Integrating AI & ML in cloud services for healthcare: The benefits and risks - ETCIO.com

Week in review: Cloud migration and cybersecurity, data trending on the dark web, Zoom security – Help Net Security

Here's an overview of some of last week's most interesting news and articles:

What type of data is trending on the dark web?
Fraud guides accounted for nearly half (49%) of the data being sold on the dark web, followed by personal data at 15.6%, according to Terbium Labs.

Cybersecurity in a remote workplace: A joint effort
With so many employees now working from home, business networks have been opened to countless untrusted networks and potentially some unsanctioned devices. Naturally, the question of security arises given the need to ensure that employees are well prepared for the challenges associated with remote work. It also means that businesses must be certain that their security infrastructure is well geared to secure personal and corporate data.

Will Zoom manage to retain security-conscious customers?
While Zoom Video Communications is trying to change the public's rightful perception that, at least until a few weeks ago, Zoom security and privacy were low on its list of priorities, some users are already abandoning ship.

GDPR, CCPA and beyond: How synthetic data can reduce the scope of stringent regulations
As many organizations are still discovering, compliance is complicated. Stringent regulations, like the GDPR and the CCPA, require multiple steps from numerous departments within an enterprise in order to achieve and maintain compliance.

April 2020 Patch Tuesday: Microsoft fixes three actively exploited vulnerabilities
For the April 2020 Patch Tuesday, Adobe plugs 5 flaws and Microsoft 113, three of which are currently being exploited by attackers.

VMware plugs critical flaw in vCenter Server, patch ASAP!
VMware has fixed a critical vulnerability (CVE-2020-3952) affecting vCenter Server, which can be exploited to extract highly sensitive information that could be used to compromise vCenter Server or other services that depend on the VMware Directory Service (vmdir) for authentication.

On my mind: Transitioning to third-party cloud services
The transition from traditional onsite data colocation to the use of third-party cloud shared tenant services should be on everyone's mind. With this growing shift, everyone from individuals to enterprises will continue to fuel threat actors by improperly storing information in the cloud.

Using Cisco IP phones? Fix these critical vulnerabilities
Cisco has released another batch of fixes for a number of its products. Among the vulnerabilities fixed are critical flaws affecting a variety of Cisco IP phones, as well as Cisco UCS Director and Cisco UCS Director Express for Big Data, its unified infrastructure management solutions for data center operations.

You have to consider cybersecurity at all points of a cloud migration
Human error and complex cloud deployments open the door to a wide range of cyber threats, according to Trend Micro.

Phishing kits: The new bestsellers on the underground market
Phishing kits are the new bestsellers of the underground market, with the number of phishing kit ads on underground forums and their sellers having doubled in 2019 compared to the previous year, Group-IB reveals.

760+ malicious packages found typosquatting on RubyGems
Researchers have discovered over 760 malicious Ruby packages (aka gems) typosquatting on RubyGems, the Ruby community's gem repository / hosting service.

Small businesses unprepared for remote working, most don't provide cybersecurity training
The overnight move to a virtual workplace has increased cybersecurity concerns for small business owners, but many still have not implemented remote working policies to address cybersecurity threats, according to a survey by the Cyber Readiness Institute (CRI).

Zoom in crisis: How to respond and manage product security incidents
Zoom is in crisis mode, facing grave and very public concerns regarding the trust in management's commitment to secure products, the respect for user privacy, the honesty of its marketing, and the design decisions that preserve a positive user experience. Managing the crisis will be a major factor in determining Zoom's future.

Are we doing enough to protect connected cars?
Even though connected cars should meet the highest levels of security, safety, and performance, we know this is not always the case. In this interview, Moshe Shlisel, CEO at GuardKnox, discusses today's most pressing issues related to automotive security.

The dangers of assumptions in security
Assuming things is bad for your security posture. You are leaving yourself vulnerable when you assume that what you have is what you need, or that what you have is working as advertised. You assume you are protected, but are you really?

Application security: Getting it right, from the start
When you set out to design an application, you want to make sure it behaves as intended. In other words, that it does what you want, when it's supposed to, and that it does so consistently.

Information security goes non-binary
Finding security holes in information systems is as old as the first commercially available computer. Back when a computer was something that sat in a computer room, users would try to bypass restrictions, sometimes simply by trying to guess the administrator's password.

Office printers: The ticking IT time bomb hiding in plain sight
Office printers don't have to be security threats: with foresight and maintenance they're very easily threat-proofed. The problem is that system administrators rarely give the humble printer (or scanner, or multifunction printer) much attention.

New lower pricing for CISSP, CCSP and SSCP online instructor-led training
Whether you're studying for the CISSP, CCSP, SSCP or another industry-leading (ISC)² certification, (ISC)² is here to help you stay on track to certification with its Official Online Instructor-Led Training, now at a new lower price.

US victims lose $13 million from COVID-19-related scams
Successful COVID-19-themed fraud attempts perpetrated in the US since the beginning of the year have resulted in a little over $13 million in losses, the Federal Trade Commission has shared.

When your laptop is your workspace, the real office never closes
With the COVID-19 pandemic, working from home has moved from a company perk to a hard requirement. Government social distancing mandates have forced complete office closures, completely transforming how and where people work. With people working from home and connected to business applications running in the cloud, the notion of an office building representing the company network has vanished overnight.

Shift to work-from-home: Most IT pros worried about cloud security
As most companies make the rapid shift to work-from-home to stem the spread of COVID-19, a significant percentage of IT and cloud professionals are concerned about maintaining the security of their cloud environments during the transition, according to a survey conducted by Fugue.

New infosec products of the week: April 17, 2020
A rundown of the most important infosec products released last week.

Visit link:
Week in review: Cloud migration and cybersecurity, data trending on the dark web, Zoom security - Help Net Security

What is edge computing? The benefits of mobile edge computing and 5G – Verizon Communications

Edge computing is based on bringing computing resources closer to users, at the edge of the network. By placing cloud resources physically near the source of the data, instead of in data centers hundreds or thousands of miles away, edge computing can help critical, performance-impacting applications respond more quickly and efficiently.

In today's network architecture, data is typically processed either on our devices, like PCs and smartphones, or in a centralized cloud (apps like Gmail, Dropbox and others run in such a cloud). The cloud provides infrastructure and other powerful capabilities like machine learning, and gives us unparalleled access to software and data, but performance can sometimes be slow or spotty. Edge computing attempts to overcome this performance issue.

Verizon first launched a Mobile Edge Compute (MEC) service with AWS in November. We're calling it Verizon 5G Edge. It uses all the benefits of 5G cellular technology to provide even faster access to the applications and data individuals and businesses need.

By the end of 2020, billions of connected devices are expected to be added to cellular networks, requiring both wide swaths of cellular spectrum and near-real-time processing with minimal latency. Verizon's 5G Ultra Wideband network should help deliver on those demands. 5G technology is expected to play a key role in increasing the speed at which data travels between two locations, and edge computing will help shorten the distance between them.

Edge computing brings large servers and data centers, or the cloud, closer to the end user. This helps with situations like augmented reality, where the real-time nature of the data processing is critical.

Without edge computing, data would likely need to travel much further, to a central cloud server, and the resulting latency, or lag time, could be noticeably longer.
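The distance argument can be made concrete with a rough propagation-delay calculation (our own illustration; it ignores routing, queuing and processing delays, which usually add considerably more):

```python
# Rough round-trip propagation delay over fiber, ignoring routing/processing.
# Light in optical fiber travels at roughly two-thirds of c (~200,000 km/s).
SPEED_IN_FIBER_KM_PER_S = 200_000

def round_trip_ms(distance_km):
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1_000

print(f"Edge site 50 km away:        {round_trip_ms(50):.2f} ms")    # ~0.5 ms
print(f"Central cloud 2,000 km away: {round_trip_ms(2000):.2f} ms")  # ~20 ms
```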

Additionally, edge computing is expected to have a positive impact on agriculture, remote healthcare, and manufacturing, among many other applications.

Verizon continues to develop 5G edge technology to revolutionize mobility and connectivity across devices. Learn more about what 5G is and all the implications for the technology of the future.

View post:
What is edge computing? The benefits of mobile edge computing and 5G - Verizon Communications

AI Could Save the World, If It Doesn't Ruin the Environment First – PCMag

When Mohammad Haft-Javaherian, a student at the Massachusetts Institute of Technology, attended MIT's Green AI Hackathon in January, it was out of curiosity to learn about the capabilities of a new supercomputer cluster being showcased at the event. But what he had planned as a one-hour exploration of a cool new server drew him into a three-day competition to create energy-efficient artificial-intelligence programs.

The experience resulted in a revelation for Haft-Javaherian, who researches the use of AI in healthcare: "The clusters I use every day to build models with the goal of improving healthcare have carbon footprints," Haft-Javaherian says.

The processors used in the development of artificial intelligence algorithms consume a lot of electricity. And in the past few years, as AI usage has grown, its energy consumption and carbon emissions have become an environmental concern.

"I changed my plan and stayed for the whole hackathon to work on my project with a different objective: to improve my models in terms of energy consumption and efficiency," says Haft-Javaherian, who walked away with a $1,000 prize from the hackathon. He now considers carbon emissions an important factor when developing new AI systems.

But unlike Haft-Javaherian, many developers and researchers overlook or remain oblivious to the environmental costs of their AI projects. In the age of cloud computing services, developers can rent online servers with dozens of CPUs and powerful graphics processors (GPUs) in a matter of minutes and quickly develop powerful artificial intelligence models. And as their computational needs rise, they can add more processors and GPUs with a few clicks (as long as they can foot the bill), not realizing that with every added processor, they're contributing to the pollution of our green planet.

The recent surge in AI's power consumption is largely caused by the rise in popularity of deep learning, a branch of artificial-intelligence algorithms that depends on processing vast amounts of data. "Modern machine-learning algorithms use deep neural networks, which are very large mathematical models with hundreds of millions, or even billions, of parameters," says Kate Saenko, associate professor at the Department of Computer Science at Boston University and director of the Computer Vision and Learning Group.

These many parameters enable neural networks to solve complicated problems such as classifying images, recognizing faces and voices, and generating coherent and convincing text. But before they can perform these tasks with optimal accuracy, neural networks need to undergo training, which involves tuning their parameters by performing complicated calculations on huge numbers of examples.

"To make matters worse, the network does not learn immediately after seeing the training examples once; it must be shown examples many times before its parameters become good enough to achieve optimal accuracy," Saenko says.
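A toy example makes the point: even a tiny model revisits the entire dataset epoch after epoch, which is why training dwarfs the cost of a single prediction (a minimal NumPy sketch, not a deep network):

```python
# Toy gradient-descent training loop: every epoch re-processes all examples,
# illustrating why training costs far more compute than one inference pass.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                 # 1,000 examples, 20 features
true_w = rng.normal(size=20)
y = X @ true_w + 0.1 * rng.normal(size=1000)    # noisy targets

w = np.zeros(20)
for epoch in range(200):                        # 200 passes over the same data
    grad = 2 * X.T @ (X @ w - y) / len(y)       # mean-squared-error gradient
    w -= 0.05 * grad                            # small parameter update

print(np.mean((X @ w - y) ** 2))                # near the 0.01 noise floor
```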

All this computation requires a lot of electricity. According to a study by researchers at the University of Massachusetts, Amherst, the electricity consumed during the training of a transformer, a type of deep-learning algorithm, can emit more than 626,000 pounds of carbon dioxide, nearly five times the emissions of an average American car. Another study found that AlphaZero, Google's Go- and chess-playing AI system, generated 192,000 pounds of CO2 during training.
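The conversion behind such figures is just energy consumed times the carbon intensity of the grid supplying it. The numbers below are our own assumptions, chosen to land near the study's transformer figure; they are not values taken from the studies themselves:

```python
# Generic footprint estimate: kWh consumed x grid carbon intensity.
# Both inputs are illustrative assumptions, not the studies' data.
energy_kwh = 656_000        # assumed total training energy
kg_co2_per_kwh = 0.433      # rough average grid carbon intensity

kg_co2 = energy_kwh * kg_co2_per_kwh
print(f"{kg_co2:,.0f} kg CO2 (~{kg_co2 * 2.20462:,.0f} lbs)")  # ~626,000 lbs
```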

To be fair, not all AI systems are this costly. Transformers are used in a fraction of deep-learning models, mostly in advanced natural-language processing systems such as OpenAI's GPT-2 and BERT, which was recently integrated into Google's search engine. And few AI labs have the financial resources to develop and train expensive AI models such as AlphaZero.

Also, after a deep-learning model is trained, using it requires much less power. "For a trained network to make predictions, it needs to look at the input data only once, and it is only one example rather than a whole large database. So inference is much cheaper to do computationally," Saenko says.

Many deep-learning models can be deployed on smaller devices after being trained on large servers. Many applications of edge AI now run on mobile devices, drones, laptops, and IoT (Internet of Things) devices. But even small deep-learning models consume a lot of energy compared with other software. And given the expansion of deep-learning applications, the cumulative costs of the compute resources being allocated to training neural networks are developing into a problem.

"We're only starting to appreciate how energy-intensive current AI techniques are. If you consider how rapidly AI is growing, you can see that we're heading in an unsustainable direction," says John Cohn, IBM Fellow and research scientist with the MIT-IBM Watson AI Lab, who co-led the Green AI Hackathon at MIT.

According to one estimate, by 2030, more than 6 percent of the world's energy may be consumed by data centers. "I don't think it will come to that, though I do think exercises like our hackathon show how creative developers can be when given feedback about the choices they're making. Their solutions will be far more efficient," Cohn says.

"CPUs, GPUs, and cloud servers were not designed for AI work. They have been repurposed for it and, as a result, are less efficient than processors that were designed specifically for AI work," says Andrew Feldman, CEO and cofounder of Cerebras Systems. He compares the use of heavy-duty generic processors for AI to using an 18-wheel truck to take the kids to soccer practice.

Cerebras is one of a handful of companies that are creating specialized hardware for AI algorithms. Last year, it came out of stealth with the release of the CS-1, a huge processor with 1.2 trillion transistors, 18 gigabytes of on-chip memory, and 400,000 processing cores. Effectively, this allows the CS-1, the largest computer chip ever made, to house an entire deep learning model without the need to communicate with other components.

"When building a chip, it is important to note that communication on-chip is fast and low-power, while communication across chips is slow and very power-hungry," Feldman says. "By building a very large chip, Cerebras keeps the computation and the communication on a single chip, dramatically reducing overall power consumed. GPUs, on the other hand, cluster many chips together through complex switches. This requires frequent communication off-chip, through switches and back to other chips. This process is slow, inefficient, and very power-hungry."

The CS-1 uses a tenth of the power and space of a rack of GPUs that would provide equivalent computational power.

Satori, the new supercomputer that IBM built for MIT and showcased at the Green AI Hackathon, has also been designed to perform energy-efficient AI training and was recently rated one of the world's greenest supercomputers. "Satori is equipped to give energy/carbon feedback to users, which makes it an excellent laboratory for improving the carbon footprint of both AI hardware and software," says IBM's Cohn.

Cohn also believes that the energy sources used to power AI hardware are just as important. Satori is now housed at the Massachusetts Green High Performance Computing Center (MGHPCC), which is powered almost exclusively by renewable energy.

"We recently calculated the cost of a high workload on Satori at MGHPCC compared to the average supercomputer at a data center using the average mix of energy sources. The results are astounding: One year of running the load on Satori would release as much carbon into the air as is stored in about five fully-grown maple trees. Running the same load on the 'average' machine would release the carbon equivalent of about 280 maple trees," Cohn says.

Yannis Paschalidis, the director of Boston University's Center for Information and Systems Engineering, proposes better integration of data centers and energy grids, which he describes as demand-response models. "The idea is to coordinate with the grid to reduce or increase consumption on demand, depending on electricity supply and demand. This helps utilities better manage the grid and integrate more renewables into the production mix," Paschalidis says.

For instance, when renewable energy supplies such as solar and wind power are scarce, data centers can be instructed to reduce consumption by slowing down computation jobs and putting low-priority AI tasks on pause. And when there's an abundance of renewable energy, the data centers can increase consumption by speeding up computations.

The smart integration of power grids and AI data centers, Paschalidis says, will help manage the intermittency of renewable energy sources while also reducing the need to have too much stand-by capacity in dormant electricity plants.
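A toy policy shows the shape of the idea (our own sketch with assumed thresholds, not Paschalidis's model): map the grid's current renewable share to a compute cap and let the job scheduler throttle low-priority training accordingly.

```python
# Toy demand-response policy for an AI data center (assumed thresholds).
def target_utilization(renewable_fraction):
    """Map the grid's renewable share to a compute cap for flexible jobs."""
    if renewable_fraction < 0.2:
        return 0.5   # scarce renewables: pause/slow low-priority training
    if renewable_fraction > 0.6:
        return 1.0   # abundant renewables: run queued jobs at full speed
    return 0.8       # otherwise: normal operation

for share in (0.1, 0.4, 0.7):
    print(f"renewables at {share:.0%} -> run at {target_utilization(share):.0%}")
```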

Scientists and researchers are looking for ways to create AI systems that don't need huge amounts of data during training. After all, the human brain, which AI scientists try to replicate, uses a fraction of the data and power that current AI systems use.

During this year's AAAI Conference, Yann LeCun, a deep-learning pioneer, discussed self-supervised learning: deep-learning systems that can learn with much less data. Others, including cognitive scientist Gary Marcus, believe that the way forward is hybrid artificial intelligence, a combination of neural networks and the more classic rule-based approach to AI. Hybrid AI systems have proven to be more data- and energy-efficient than pure neural-network-based systems.

"It's clear that the human brain doesn't require large amounts of labeled data. We can generalize from relatively few examples and figure out the world using common sense. Thus, 'semi-supervised' or 'unsupervised' learning requires far less data and computation, which leads to both faster computation and less energy use," Cohn says.

Read this article:
AI Could Save the World, If It Doesn't Ruin the Environment First - PCMag

Edge AI Is The Future, Intel And Udacity Are Teaming Up To Train Developers – Forbes


On April 16, 2020, Intel and Udacity jointly announced their new Intel Edge AI for IoT Developers Nanodegree program to train the developer community in deep learning and computer vision. If you are wondering where AI is headed, now you know: it's headed to the edge. Edge computing is the concept of storing and processing data directly at the location where it is needed. The global edge AI software market is forecast to reach $1.12 billion by 2023.

There's a real need for developers worldwide in this new market. Intel and Udacity aim to train 1 million developers.

In the age of innovation, as data continues to grow, there's a real need for data storage and computation to be located on the device. Privacy, security, and speed are the biggest reasons the distributed edge model can work better in certain use cases. When we are all concerned about our personal data being stored on a cloud server, what if the app could handle our personal data on the device instead? With edge AI, the personalization features that we want from an app can be achieved on the device. Transferring data over networks and into cloud-based servers introduces latency, and at each endpoint there are security risks involved in the data transfer.

While cloud computing offers unquestionable economies of scale, a distributed computing model is driven by the nature of the data itself. The volume of data can make it difficult or expensive to move due to bandwidth costs or availability. The velocity of data will catalyze more real-time applications that cannot be limited by network latency. And the variety of data will be governed by regulatory, privacy and security constraints.

This is why the edge AI software market is forecast to grow from $355 million in 2018 to $1.12 billion by 2023.

At the beginning of the AI evolution, we were concerned with crossing over from statistical models to data science, machine learning, and building algorithms that run in the cloud. Now, software engineers increasingly find that their projects are naturally scoped to include an AI component. You don't have to be a machine learning engineer to know about deep learning or reinforcement learning.

Now, IoT developers who may have been sitting on the sidelines, working on software projects that are more feature-based than data-based, will have an opportunity to get involved in the AI evolution. The Intel Edge AI for IoT Developers Nanodegree program will introduce students to the Intel OpenVINO toolkit, which allows developers to deploy pre-trained deep learning models through a high-level C++ or Python inference engine API integrated with application logic.
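As a flavor of what that looks like, here is a hedged sketch of OpenVINO-style Python inference circa the 2020 releases; the model paths are placeholders, and exact module paths and signatures vary by toolkit version:

```python
# Hedged sketch of OpenVINO inference via the Python API (2020-era releases).
# "model.xml"/"model.bin" are placeholder IR files; signatures vary by version.
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="CPU")

input_name = next(iter(net.input_info))                # first (often only) input
frame = np.zeros((1, 3, 224, 224), dtype=np.float32)   # placeholder image batch
results = exec_net.infer(inputs={input_name: frame})   # dict of output blobs
```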

Students will work on Intel's IoT DevCloud to develop, test, and run their workloads on a cluster of the latest Intel hardware and software. Not only will IoT developers learn to apply AI in their applications, but they will also work on performance and other issues that arise when building data-centric applications.

Software engineers, machine learning engineers, data scientists and other technologists who've been working on cloud-based AI applications now have a new direction to take in their learning paths. Learning to develop edge AI applications opens a new perspective toward more user-driven AI application development. With more user-based personalization directly on the device, business solutions, features, and user data can be viewed from a more user-centric perspective.

This program will be beneficial for all developers who want to be involved in AI-based projects.

This Intel and Udacity collaboration will be the pioneering program that lays the foundation of training in edge AI for years to come. Just like Udacity's hugely popular machine learning and AI Nanodegree programs, the Intel Edge AI for IoT Developers Nanodegree program will facilitate streamlined training, project-based learning, mentorship and certification, allowing a quick ramp-up of both AI and IoT development knowledge.

For computer science majors fresh out of school looking for an entry point into the industry, this type of program can offer both opportunities and skills that bridge the gap between education and real-world applications.

At a time when we are all concerned about our job security and prospects due to the coronavirus pandemic, it's good to know that there are new paths to explore. If you don't currently work in manufacturing or healthcare, it's difficult to envision edge AI being used on factory assembly lines and in urgent-care medical imaging equipment.

But how about imagining the technology in drones, security cameras, robots and self-driving cars?

There are many smartphone apps that we use day to day that could potentially deliver more personalized features through an AI component.

The value and impact of edge AI applications show no limits when it comes to use cases. The ingenuity of companies during this COVID-19 crisis is humbling. Industries including public safety and healthcare, for example, are designing and deploying solutions now that leverage AI and computer vision technologies to deliver accurate, real-time insights to help with tracking, testing and treatment. The technology exists; we are only limited by the imagination, at scale, of the developer community.

Edge AI is the next stage of the AI evolution, one that will change the way we interact with our devices and offer better and more secure ways to deploy AI applications.

Students in the program will learn directly from experienced professionals in the edge AI and IoT field, such as Stewart Christie, who has been with Intel for almost 20 years and is currently the community manager of the Internet of Things Developer Program; Archana Iyer, former research engineer at Saama; Soham Chatterjee, former software innovator at Intel; and Michel Virgo, senior curriculum manager at Udacity.

The project-based approach that Udacity uses allows students to learn hands-on skills while interacting with and receiving mentorship from experienced professionals. Many developers are familiar with this type of quick ramp-up of development skill sets across multiple areas.

Projects in the Nanodegree program include:

If you are reluctant to commit, try the free Intel Edge AI Fundamentals course or take advantage of Udacity's free access for one month.

Continue reading here:
Edge AI Is The Future, Intel And Udacity Are Teaming Up To Train Developers - Forbes

AMD Extends 2nd Gen AMD EPYC Processor Family with New Processors – IT News Online

IT News Online Staff
2020-04-18

AMD has extended the 2nd Gen AMD EPYC processor family with three new processors that combine the balanced and efficient AMD Infinity architecture with higher-speed "Zen 2" cores for optimal performance on database, commercial high-performance computing (HPC) and hyperconverged infrastructure workloads.

The AMD EPYC 7Fx2 processors provide new performance capabilities for workloads at the heart of the enterprise market: database, with up to 17 percent higher SQL Server performance than the competition; hyperconverged infrastructure, with up to 47 percent higher VMmark 3.1 score (using vSAN as the storage tier in a 4-node cluster) than the competition, a new world record; and commercial high-performance computing (HPC), with up to 94 percent higher per-core computational fluid dynamics individual application performance than the competition.

"AMD EPYC continues to redefine the modern data center, and with the addition of three powerful new processors we are enabling our customers to unlock even better outcomes at the heart of the enterprise market," said Dan McNamara, senior vice president and general manager, server business unit, AMD. "With our trusted partners, together we are pushing the limits of per core performance and value in hyperconverged infrastructure, commercial HPC and relational database workloads."

A Balanced System That's More than Gigahertz

The new 2nd Gen AMD EPYC 7Fx2 processors provide leading per-core performance and breakthrough value, adding the highest per-core performance of the EPYC family.

The performance of these new processors comes from a balanced architecture that combines high-performance "Zen 2" cores, innovations in system design like PCIe 4.0 and DDR4-3200 memory, and the AMD Infinity architecture, providing customers with optimal system performance that enables better real-world application performance.

Ecosystem Growing with AMD EPYC

The ecosystem of OEMs, cloud providers, ISVs and IHVs using 2nd Gen AMD EPYC processors continues to grow, with existing OEMs and new partners adopting the new AMD EPYC 7Fx2 processors.

Dell Technologies will support all three processors across its entire lineup of AMD EPYC based Dell EMC PowerEdge servers, including the R6525, which holds a world-record 2P four-node benchmark result on VMmark 3 with VMware vSAN.

"These new AMD EPYC 7Fx2 processors enable Dell EMC PowerEdge servers to drive substantial performance benefits for customer business applications like database and hyperconverged infrastructure, where Dell EMC PowerEdge servers hold a world record in benchmark performance. Our customers will truly benefit from these new processors as we continue to grow our AMD EPYC family of PowerEdge platforms," said Rajesh Pohani, vice president, Server Platform Product Management, Dell Technologies.

HPE continues to expand its offerings using 2nd Gen AMD EPYC processors with latest support of HPE SimpliVity, an intelligent hyper-converged infrastructure solution. HPE will also support all three AMD EPYC 7Fx2 processors on the recently announced HPE Apollo 2000 Gen10 Plus system, HPE ProLiant DL385 Gen10 Plus server and HPE ProLiant DX servers.

"We are pleased to expand support of the 2nd Gen AMD EPYC processors across our portfolios, which include new additions with the HPE Apollo 2000 Gen10 Plus system, HPE ProLiant DL385 Gen10 Plus server and HPE ProLiant DX servers to meet high-frequency and performance needs for our customers in high-performance computing and database environments," said Peter Ungaro, senior vice president and general manager, HPC and Mission Critical Solutions (MCS), HPE.

IBM Cloud is the first cloud provider to offer its clients the AMD EPYC 7F72 processors in their bare metal offering, providing access to fast, high core-count dual socket bare metal servers. Additionally, IBM recently announced the availability of its first bare metal server powered by the AMD EPYC 7642 processor.

"We are excited to be the first cloud provider to support the new AMD EPYC 7F72 processor. Now, IBM Cloud provides access to another high core-count dual socket bare metal server with high clock speed frequency, giving our clients more optimized platform choices for compute-intense workloads such as analytics, commercial HPC and EDA. We stay committed to enabling flexible and powerful bare metal experiences for clients to enhance performance and throughput," said Satinder Sethi, general manager, IBM Cloud Infrastructure Services.

Lenovo will support the new AMD EPYC 7Fx2 processors on its ThinkSystem SR635 and SR655 platforms. These ThinkSystem platforms are already a great choice for a variety of enterprise workloads, including data analytics, software-defined storage and infrastructure for remote workers. Lenovo's storage and PCIe capabilities, coupled with AMD EPYC core counts and I/O density, will help provide customers with choice as their business needs evolve. These new higher-frequency 2nd Gen AMD EPYC processors, with core clock speeds increased by up to 15 percent, give customers of the single-socket ThinkSystem platforms greater options for workloads where per-core performance is critical. Lenovo's one-socket optimized platforms with these new processors allow customers to deploy them where traditionally two-socket systems were used, providing power and software licensing cost savings.

"Today's business dynamics are presenting customers with new challenges to improve speed, cost and performance. We feel confident we have the right portfolio to provide our customers with enhanced choice as organizations look to enable remote working capabilities and manage their increased data and storage requirements," said Kamran Amini, vice president and general manager, Server, Storage and Software Defined Infrastructure, Lenovo Data Center Group.

Microsoft recognizes the impact the new AMD EPYC 7Fx2 processors have on providing Microsoft data platform customers the best experience possible, including an up to 17 percent higher SQL Server TPM per core performance. "Microsoft data platform solutions help customers release the potential hidden in data and reveal insights and opportunities to transform a business. A critical part of this process is making sure a database has access to an efficient, powerful and fast processor and that's exactly what the new AMD EPYC 7Fx2 processors provide Microsoft data platform solutions customers," said Jamie Reding, SQL Server program manager, Microsoft.

Nutanix, in conjunction with HPE, announced that it expects Nutanix HCI software to be supported on select AMD EPYC based HPE ProLiant servers by May. HPE also announced the upcoming availability of AMD EPYC 7Fx2 processors on HPE ProLiant DX servers in Q3.

"We are excited to have validated Nutanix's HCI software for 2nd Gen AMD EPYC processor based HPE ProLiant systems. This will bring 2nd Gen AMD EPYC processor support to Nutanix software, giving more flexibility and choice to our customers while unleashing greater workload performance for databases, analytics, VDI and other virtualized business critical applications," said Tarkan Maner, chief commercial officer, Nutanix.

Supermicro is launching the industry's first blade platform built for 2nd Gen AMD EPYC processors with immediate support for the new AMD EPYC 7Fx2 processors combined with integrated 25G Ethernet and optional 100G EDR InfiniBand support with 200G HDR in the near future. In addition, all Supermicro A+ platforms including Ultra, GPU, WIO, Twin and Mainstream systems will support the new AMD EPYC 7Fx2 processors immediately.

"Adding the new SuperBlade platform to our extensive portfolio of products supporting the 2nd Gen AMD EPYC processors gives our customers another powerful choice when redefining their modern data center. Leveraging support for the new AMD EPYC 7Fx2 processors, our latest SuperBlade and Supermicro A+ platforms further excel at database, EDA and other data-intensive workloads," said Vik Malyala, senior vice president, Field Application Engineering and Business Development, Supermicro.

VMware has added support for the new 2nd Gen AMD EPYC 7Fx2 processors, providing customers with access to powerful virtualization platforms.

"The 2nd Gen AMD EPYC 7Fx2 processors bring new value to VMware customers. They provide a unique balance of strong per core performance coupled with an industry-leading per-processor memory capacity of 4 TB. A key element of VMware vSphere, vSAN and now VMware Cloud Foundation market success has been our commitment to helping customers quickly adopt the latest hardware innovation," said Richard A. Brunner, chief technology officer, Server Platform Technologies, VMware.

The new processors are available now through multiple OEMs and IBM Cloud.

See more here:
AMD Extends 2nd Gen AMD EPYC Processor Family with New Processors - IT News Online

The Outlook For Infrastructure Is Cloudy In A Good Way – The Next Platform

If we are ever going to know what effect the coronavirus pandemic has had on the IT sector, we have to keep track of what was going on before the outbreak started to hit us hard in the first quarter of 2020. This is, in part, why the data we have from 2018 and 2019 is going to be important. It establishes a baseline for what was normal then, and it will help us reckon a baseline for what will be normal on the other side of this.

To that end, we have cast an eye on the latest cloud IT infrastructure spending statistics compiled by IDC, which cover spending on servers, storage, and switching for cloud and non-cloud deployments, in both public and private clouds and in on-premises datacenters for the plain vanilla, non-cloud (yet presumably still virtualized in some cases) systems. The lines are fuzzy between some of these categories, admittedly, but we are really most interested in the deltas, not the theological arguments over what is a cloud and what is not.

As we have reported previously, server infrastructure spending by hyperscalers and cloud builders was muted starting in late 2018 and continued into the first two quarters of 2019, rebounding strongly in the second half of 2019. We called this the last hurrah before the server recession, and our analysis of this can be found here. This latest cloud spending data from IDC talks about the market more broadly, adding in all core IT spending on the three main types of hardware (and removing double counting between servers and storage and servers and switching where they are converged) to give what could be a more accurate view of what is actually going on out there. We have also looked at IDC's original IT spending forecasts, which predicted only curtailed spending growth as the coronavirus outbreak was starting to build momentum, and its more recent revision of those forecasts, which shows an actual decline in spending for 2020.

None of the data on cloud and non-cloud IT spending that we are talking about in this story has a specific forecast for 2020 or 2021, but the company did share some insight that echoes what was said in the broader IT spending analysis, and it did offer an end state for spending way out in 2024.

"While the beginning of 2020 was marked by supply chain issues that should be resolved before the end of the second quarter, the negative economic impact will hit enterprise customers' capex spending," explained Kuba Stolarski, research director for infrastructure systems, platforms and technologies at IDC, in a statement accompanying the figures. "As enterprise IT budgets tighten through the year, public cloud will see an increase in demand for services. This increase will come in part from the surge of work-from-home employees using online collaboration tools, but also from workload migration to public cloud as enterprises seek ways to save money for the current year. Once the coast is clear of coronavirus, IDC expects some of this new cloud service demand to remain sticky going forward."

Let's go over the numbers for the fourth quarter of 2019 and all of 2019, and then take a look at the 2024 forecast.

The fourth quarter was a pretty good one, a kind of return to normal of its own after a downturn in spending. In the period ended in December, spending on infrastructure by public clouds (stuff meant to be rented out or to support services offered by companies) rose by 14.5 percent to $13.3 billion, while spending on private cloud infrastructure (installed on premises by enterprises, governments, and other institutions for their own use) rose by 8.2 percent to $6.1 billion, according to IDC. Add them up, and total cloud spending on infrastructure rose by 12.5 percent to $19.4 billion. Overall IT infrastructure spending was $38.1 billion in Q4 2019, up 3.3 percent, and spending on non-cloud (meaning traditional free-standing bare metal iron or perhaps only rudimentary server virtualization, depending on where IDC draws the cloud line) IT infrastructure was down 5.5 percent to $18.6 billion.
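For readers who want to sanity-check these deltas, here is a minimal back-of-the-envelope sketch in Python, using only the IDC figures quoted above. (The published components are rounded, which is why the sum lands a tenth of a billion shy of IDC's $38.1 billion total.)

```python
# Back-of-the-envelope check on the IDC Q4 2019 figures quoted above.
# All values are in billions of US dollars; growth rates are year over year.

q4_2019 = {"public cloud": 13.3, "private cloud": 6.1, "non-cloud": 18.6}

total_cloud = q4_2019["public cloud"] + q4_2019["private cloud"]
total_all = total_cloud + q4_2019["non-cloud"]
print(f"Total cloud:          ${total_cloud:.1f}B")  # $19.4B, as reported
print(f"Total infrastructure: ${total_all:.1f}B")    # $38.0B (IDC says $38.1B; rounding)

def implied_baseline(current_b: float, growth_pct: float) -> float:
    """Prior-year spending implied by a current figure and its growth rate."""
    return current_b / (1 + growth_pct / 100)

print(f"Q4 2018 public cloud: ${implied_baseline(13.3, 14.5):.1f}B")  # ~$11.6B
print(f"Q4 2018 non-cloud:    ${implied_baseline(18.6, -5.5):.1f}B")  # ~$19.7B
```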

Here is what the data looks like since 2014, when IDC first started breaking down the market this way across these categories, including public cloud spending (the yellow line), private cloud spending (the blue line), and their sum, total cloud spending (the green line):

For all the talk about how spending on the cloud has taken over the world, by which people often hear "public cloud" even when the word "public" is not there, it is important to remember how much traditional, non-cloud infrastructure is still acquired in the world. And it is also important to remember that in 2018, 66.4 percent of all spending was done by organizations other than hyperscalers and cloud builders, and in 2019, that level held at 66 percent. Yes, that spending is down from $89.6 billion in 2018 to $88.1 billion in 2019, but it is still about two-thirds of all infrastructure spending. So hyperscalers and public clouds have not taken over the world, and we can paint a scenario where organizations won't want to pay the cloud premium after the Great Infection, or worry about whether or not there is capacity available for them, and will dig in on their own datacenters.
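Incidentally, those two shares also pin down the size of the whole market. A small sketch, assuming the percentages are shares of total server, storage, and switching spending as IDC defines it:

```python
# Non-hyperscaler spending (everyone other than hyperscalers and cloud
# builders) and its share of all infrastructure spending, per IDC.
# Dividing the dollars by the share recovers the implied market total.
for year, dollars_b, share in [(2018, 89.6, 0.664), (2019, 88.1, 0.660)]:
    total = dollars_b / share
    print(f"{year}: implied total infrastructure spend ~ ${total:.1f}B")
# ~$134.9B in 2018 and ~$133.5B in 2019, i.e. a slightly shrinking market.
```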

It will be interesting to see how this plays out. We don't think anyone can call it quite yet.

Here is what the full breakdown of IT spending for 2018 and 2019 by category looks like:

Here is what the breakdown by vendor for cloudy infrastructure looks like:

And here is the vendor trend since 2014:

The ODMs as a group are a proxy for hyperscaler and cloud builder spending, but some of these customers also go to the large OEMs for some of their gear, too. So it is not a perfect fit. But what we also think needs to be remembered is that the ODMs account for somewhere between 30 percent and 35 percent of total cloud infrastructure revenues, which is only 13 percent to 17 percent of total spending on servers, storage, and switching. Perspective is important.

Within these three technology domains and within the cloud infrastructure category, IDC reckons that storage platforms grew by 15.1 percent in the fourth quarter of last year, to $6.6 billion, while compute platforms rose by 14.5 percent to $10.8 billion in aggregate sales worldwide. Sales of Ethernet switches fell by 3.9 percent to $2 billion. For all of 2019, compute comprised $35.5 billion in cloud infrastructure spending (up 1.5 percent), followed by storage with $23.1 billion of spending (up 1.9 percent) and by Ethernet switching of $8.2 billion (up 5 percent). What that tells us is the breakdown of spending for distributed computing systems: 53.1 percent of the budget for cloudy stuff was spent on compute, 34.6 percent was spent on storage, and 12.3 percent was spent on switching.
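Those shares fall straight out of the full-year totals; a quick sketch using only the figures just cited:

```python
# Full-year 2019 cloud infrastructure spending by technology domain,
# in billions of US dollars, from the IDC totals cited above.
spend = {"compute": 35.5, "storage": 23.1, "ethernet switching": 8.2}

total = sum(spend.values())  # ~$66.8B in total cloud infrastructure spend
for domain, dollars in spend.items():
    print(f"{domain:>18}: ${dollars}B ({dollars / total:.1%} of cloud spend)")
# compute ~53.1%, storage ~34.6%, switching ~12.3%, matching the article.
```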

Now, you can reckon how you stand compared to the market at large. The hyperscalers and cloud builders spend more on compute and less on switching, so we are told. But that may be more of a reflection of uneven discounting than anything else.

And now, the forecast. Looking ahead to spending between 2019 and 2024, IDC says that cloud IT infrastructure spending will reach $100.1 billion a year by 2024, with a compound annual growth rate of 8.4 percent over that term. Non-cloud IT infrastructure spending will decline at a 0.7 percent CAGR over that same time to $65.3 billion, and total IT infrastructure spending will rise at a 4.2 percent CAGR to hit $165.4 billion in 2024. Presumably, this forecast has a pretty bad 2020 and a weak 2021 in it.
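The growth rates here are standard compound annual growth rates, CAGR = (end / start)^(1/years) - 1. Running the formula backwards from IDC's 2024 endpoints recovers the implied 2019 baselines, and, reassuringly, the implied cloud baseline lands right on the roughly $66.8 billion in 2019 cloud spending tallied above (the small gaps are rounding in the published rates):

```python
# CAGR = (end / start) ** (1 / years) - 1, so the implied starting
# value is end / (1 + CAGR) ** years. Forecast window is 2019-2024.
def implied_start(end_b: float, cagr_pct: float, years: int = 5) -> float:
    """2019 baseline implied by a 2024 endpoint and a CAGR, in $B."""
    return end_b / (1 + cagr_pct / 100) ** years

print(f"Cloud:     ${implied_start(100.1, 8.4):.1f}B")   # ~$66.9B
print(f"Non-cloud: ${implied_start(65.3, -0.7):.1f}B")   # ~$67.6B
print(f"Total:     ${implied_start(165.4, 4.2):.1f}B")   # ~$134.6B
```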

Read the original:
The Outlook For Infrastructure Is Cloudy In A Good Way - The Next Platform

To keep civil infrastructure projects moving, look to the cloud – Smart Cities Dive

Editor's Note: The following is a guest post from Orla Pease, vice president of digital innovation at AECOM.

More than 5.2 million Americans filed for unemployment last week, bringing the total number of jobless claims to around 22 million in the last month, all as a result of the COVID-19 pandemic.

One way we can help soften the impact of the pandemic is to enable people to continue working and contributing to their families and the economy, all while adhering to social distancing directives. Moving civil infrastructure projects into the cloud is a simple step that companies and agencies can take to keep projects running, people working and the economy moving during this unprecedented crisis.

As one of the oldest industries in the world, the civil engineering and construction (E&C) sector has a reputation for lagging other industries when it comes to digitization and virtual design. Companies that embrace digital technologies to improve how they do business, rather than simply to ride the wave of digital transformation and keep pace with the competition, often fare much better with these initiatives.

I have firsthand experience of the power of digital transformation born out of the need to collaborate across a global organization, and the results have not only brought us closer to our colleagues but to our clients as well.

The coronavirus pandemic has accelerated the need to adopt digital ways of working and has made it clear that E&C cannot afford to rest on the laurels of traditional ways of working, or be satisfied with the occasional digital innovation. We must make changes, quickly, to become digitally resilient, and that starts with transitioning civil infrastructure projects from local servers to the cloud.

This simple step can keep civil infrastructure projects moving so they will be shovel-ready when recovery begins, helping to keep America working now and when the crisis subsides. E&C companies that are solidly on the path to digital transformation are uniquely positioned to bring their clients along on the journey, and it all starts with a cornerstone of digital resilience: the cloud.

It's time the industry stopped perceiving digital transformation as an experiment in competitiveness and instead saw it as a necessity of resilience. The term "digital transformation" itself can induce analysis paralysis, seeming to mean that everything in an organization must transform. Rather than trying to go digital all at once or in fragmented efforts, focus first on enabling virtual collaboration and digitizing civil infrastructure projects so we can keep working and keep the economy moving during the coronavirus pandemic.

To keep up with all of our coverage on how the new coronavirus is impacting U.S. cities, visit our daily tracker.

Continued here:
To keep civil infrastructure projects moving, look to the cloud - Smart Cities Dive