
Cloud Computing in Construction: Trends identified by GlobalData – DesignBuild Network

More companies are adopting cloud-first strategies, indicating the long-term importance of the cloud. Adopting the cloud and its associated technologies can help firms develop strategies to protect against universal industry challenges.

Listed below are the key trends impacting cloud computing in construction, as identified by GlobalData.

The increasing complexity of projects is leading to significant challenges in delivering on time and on budget. Project monitoring is still primarily focused on tracking cost overruns and delays. A forward-looking, smart project management approach is needed to translate data insights into actions and help keep projects on track and costs down.
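The forward-looking monitoring described above usually rests on earned-value metrics, which turn cost and schedule data into early-warning signals. A minimal sketch (the function name and dollar figures are illustrative, not from the GlobalData report):

```python
def performance_indices(earned_value: float, actual_cost: float,
                        planned_value: float) -> dict:
    """Standard earned-value metrics: CPI < 1 signals a cost overrun,
    SPI < 1 signals a schedule delay."""
    return {
        "CPI": earned_value / actual_cost,    # cost performance index
        "SPI": earned_value / planned_value,  # schedule performance index
    }

# A project that has delivered $80k of planned work after spending $100k,
# against a baseline plan of $90k by this date, is both over budget and late:
print(performance_indices(80_000, 100_000, 90_000))
```

Tracking these indices continuously, rather than auditing costs after the fact, is what makes the approach forward-looking.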

Construction will need to adapt as consumers and corporations turn to more sustainable practices. Sustainability in construction encompasses the use of more sustainable materials in construction and improved management of construction waste. As global demands on the industry continue to increase, efficient resource management is needed to address these issues.

Contractors face challenges in securing people, materials, and equipment in the right place at the right time. Issues of long delays, bureaucratic procurement processes, and economic factors all drive up project costs. At the same time, the negative perception of the industry is a significant obstacle in attracting the young and talented to help reinvigorate the industry.

Increased project complexity, low profit margins, and a shifting competitive landscape have all put pressure on profitability and have made cost control vital for survival. Firms have seen project expenses increase steadily in recent years due to rising material costs, and costs are likely to rise further.

As projects become increasingly challenging and complex, higher levels of coordination are needed between all stakeholders, including project owners, designers, engineers, and contractors. Construction companies are looking to collaborate across the construction cycle through integrated contracts, joint ventures, or mergers and acquisitions (M&A).

As companies have adopted new technologies, the amount of data produced has increased. Data is quickly becoming an asset in helping effective decision making. To maximise return on investment for a project, data must be collected, stored, managed, and used efficiently.

The industry has traditionally been slow to adopt technology. Productivity has stagnated in the construction sector due to extensive regulation, a fragmented construction cycle, and misaligned incentives between contractors and owners. Thin margins have no doubt made investing in digital innovation difficult in recent years, but the productivity-boosting effects of digital are gradually being realised and becoming more widespread in construction.

Despite improvements, construction remains a dangerous profession. While gains can come from better enforcement of health and safety regulations and from training, advances are also being made in the adoption of new technologies and the use of robotics to remove the need for humans to undertake dangerous tasks.

As more of the construction process becomes digitalised, vulnerabilities have arisen from the fragmentation of construction into multiple enterprises. There is growing concern over the security of data and the threat of cyber-attacks.

This is an edited extract from the Cloud Computing in Construction Thematic Research report produced by GlobalData Thematic Research.


More here:
Cloud Computing in Construction: Trends identified by GlobalData - DesignBuild Network

Read More..

Looking to the future of quantum cloud computing – Siliconrepublic.com – Siliconrepublic.com

Trinity College Dublin's Dan Kilper and the University of Arizona's Saikat Guha discuss the quantum cloud and how it could be achieved.

Quantum computing has been receiving a lot of attention in recent years as several web-scale providers race towards so-called quantum advantage: the point at which a quantum computer is able to exceed the computing abilities of classical computing.

Large public sector investments worldwide have fuelled research activity within the academic community. The first claim of quantum advantage emerged in 2019, when Google, NASA and Oak Ridge National Laboratory (ORNL) demonstrated a computation that the quantum computer completed in 200 seconds and that the ORNL supercomputer verified up to the point of quantum advantage; running it to completion on the supercomputer was estimated to require 10,000 years.

Roadmaps that take quantum computers even further into this regime are advancing steadily. IBM has made quantum computers available for online access for many years now and recently Amazon and Microsoft started cloud services to provide access for users to several different quantum computing platforms. So, what comes next?

The step beyond access to a single quantum computer is access to a network of quantum computers. We are starting to see this emerge from the web or cloud-based quantum computers offered by cloud providers: effectively quantum computing as a service, sometimes referred to as cloud-based quantum computing.

This consists of quantum computers connected by classical networks and exchanging classical information in the form of bits, or digital ones and zeros. When quantum computers are connected in this way, they each can perform separate quantum computations and return the classical results that the user is looking for.

It turns out that with quantum computers, there are other possibilities. Quantum computers perform operations on quantum bits, or qubits. It is possible for two quantum computers to exchange information in the form of qubits instead of classical bits. We refer to networks that transport qubits as quantum networks. If we can connect two or more quantum computers over a quantum network, then they will be able to combine their computations such that they might behave as a single larger quantum computer.
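One concrete way to see what exchanging qubits buys you is quantum teleportation: a shared entangled pair plus two classical bits moves an arbitrary qubit state from one node to another. The following small state-vector simulation is a sketch in plain NumPy (not any vendor's API) that verifies the protocol for every possible measurement outcome:

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]  # projectors |0><0|, |1><1|

def kron_all(mats):
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def lift(gate, qubit, n=3):
    """Apply a single-qubit gate to one qubit of an n-qubit register."""
    return kron_all([gate if q == qubit else I for q in range(n)])

def cnot(control, target, n=3):
    """CNOT built from projectors on the control qubit."""
    t0 = [P[0] if q == control else I for q in range(n)]
    t1 = [P[1] if q == control else (X if q == target else I) for q in range(n)]
    return kron_all(t0) + kron_all(t1)

# Qubit 0 holds the state to send; qubits 1 and 2 share a Bell pair.
psi = np.array([0.6, 0.8])                  # a|0> + b|1>
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
state = lift(H, 0) @ cnot(0, 1) @ np.kron(psi, bell)

# For each possible result (m0, m1) of measuring qubits 0 and 1, the
# receiver's correction Z^m0 X^m1 recovers psi exactly on qubit 2.
for m0 in (0, 1):
    for m1 in (0, 1):
        branch = kron_all([P[m0], P[m1], I]) @ state
        bob = branch.reshape(2, 2, 2)[m0, m1, :]
        bob = bob / np.linalg.norm(bob)
        fixed = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ bob
        assert np.allclose(fixed, psi)

print("teleportation verified for all four outcomes")
```

Note that the qubit itself is consumed at the sender and only classical bits travel in this protocol; a quantum network that transports qubits directly is what lets machines go further and combine their computations.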

Quantum computing distributed over quantum networks thus has the potential to significantly enhance the computing power of quantum computers. In fact, if we had quantum networks today, many believe that we could immediately build large quantum computers far into the advantage regime simply by connecting many instances of today's quantum computers over a quantum network. With quantum networks built and interconnected at various scales, we could build a quantum internet. And at the heart of this quantum internet, one would expect to find quantum computing clouds.

At present, scientists and engineers are still working on understanding how to construct such a quantum computing cloud. The key to quantum computing power is the number of qubits in the computer. These are typically micro-circuits or ions kept at cryogenic temperatures, near minus 273 degrees Celsius.

While these machines have been growing steadily in size, it is expected that they will eventually reach a practical size limit, and therefore further computing power is likely to come from network connections across quantum computers within the data centre, very much like today's classical computing data centres. Instead of racks of servers, one would expect rows of cryostats.

Quantum computing distributed over quantum networks has the potential to significantly enhance the computing power of quantum computers

Once we start imagining a quantum internet, we quickly realise that there are many software structures that we use in the classical internet that might need some type of analogue in the quantum internet.

Starting with the computers, we will need quantum operating systems and computing languages. This is complicated by the fact that quantum computers are still limited in size and not engineered to run operating systems and programming the way that we do in classical computers. Nevertheless, based on our understanding of how a quantum computer works, researchers have developed operating systems and programming languages that might be used once a quantum computer of sufficient power and functionality is able to run them.

Cloud computing and networking rely on other software technologies such as hypervisors, which manage how a computer is divided up into several virtual machines, and routing protocols to send data over the network. In fact, research is underway to develop each of these for the quantum internet. With quantum computer operating systems still under development, it is difficult to develop a hypervisor to run multiple operating systems on the same quantum computer as a classical hypervisor would.

By understanding the physical architecture of quantum computers, however, one can start to imagine how it might be organised to support different subsets of qubits to effectively run as separate quantum computers, potentially using different physical qubit technologies and employing different sub-architectures, within a single machine.
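The bookkeeping such a "quantum hypervisor" would need can be sketched classically: carve one machine's qubit register into disjoint slices, one per tenant job. Everything below (the class, names, and API) is a hypothetical illustration, not a description of any real system:

```python
class QubitPartitioner:
    """Toy allocator that divides a fixed qubit register among tenant jobs."""

    def __init__(self, total_qubits: int):
        self.total = total_qubits
        self.next_free = 0
        self.allocations: dict[str, range] = {}

    def allocate(self, job_id: str, n_qubits: int) -> range:
        """Reserve a contiguous slice of qubits for one job."""
        if self.next_free + n_qubits > self.total:
            raise RuntimeError("not enough free qubits for this job")
        slice_ = range(self.next_free, self.next_free + n_qubits)
        self.allocations[job_id] = slice_
        self.next_free += n_qubits
        return slice_

# Two jobs sharing one 16-qubit machine, each seeing its own register
machine = QubitPartitioner(16)
print(machine.allocate("job-A", 5))   # qubits 0-4
print(machine.allocate("job-B", 8))   # qubits 5-12
```

A real partitioner would also have to respect the physical qubit technologies and connectivity constraints the paragraph mentions, which is where the hard research lies.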

One important difference between quantum and classical computers and networks is that quantum computers can make use of classical computers to perform many of their functions. In fact, a quantum computer in itself is a tremendous feat of classical system engineering with many complex controls to set up and operate the quantum computations. This is a very different starting point from classical computers.

The same can be said for quantum networks, which have the classical internet to provide control functions to manage the network operations. It is likely that we will rely on classical computers and networks to operate their quantum analogues for some time. Just as a computer motherboard has many other types of electronics other than the microprocessor chip, it is likely that quantum computers will continue to rely on classical processors to do much of the mundane work behind their operation.

With the advent of the quantum internet, it is conceivable that a quantum-signalling-equipped control plane might be able to support certain quantum network functions even more efficiently.

When talking about quantum computers and networks, scientists often refer to fault-tolerant operations. Fault tolerance is a particularly important step toward realising quantum cloud computing. Without fault tolerance, quantum operations are essentially single-shot computations that are initialised and then run to a stopping point that is limited by the accumulation of errors due to quantum memory lifetimes expiring as well as the noise that enters the system with each step in the computation.

Fault tolerance would allow for quantum operations to continue indefinitely with each result of a computation feeding the next. This is essential, for example, to run a computer operating system.

In the case of networks, loss and noise limit the distance that qubits can be transported to around 100 km today. Fault tolerance through operations such as quantum error correction would allow quantum networks to extend around the world. This is quite difficult for quantum networks because, unlike classical networks, quantum signals cannot be amplified.

We use amplifiers everywhere in classical networks to boost signals that are reduced due to losses, for example, from traveling down an optical fibre. If we boost a qubit signal with an optical amplifier, we would destroy its quantum properties. Instead, we need to build quantum repeaters to overcome signal losses and noise.
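The scale of the problem is easy to quantify. Standard telecom fibre attenuates at roughly 0.2 dB/km (the usual textbook figure, not a number from this article), so photon survival falls off exponentially with distance, and no amplifier can make up the loss for qubits:

```python
ATTENUATION_DB_PER_KM = 0.2  # typical loss in standard telecom fibre

def survival_probability(distance_km: float) -> float:
    """Fraction of photons surviving a fibre span of the given length."""
    return 10 ** (-ATTENUATION_DB_PER_KM * distance_km / 10)

for d in (50, 100, 500, 1000):
    print(f"{d:>5} km: {survival_probability(d):.2e}")
```

At 100 km only about 1% of photons survive, and at 1,000 km essentially none do, which is why repeaters rather than amplifiers are the only route to a global quantum network.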

Together we have our sights set on realising the networks that will make up the quantum internet

If we can connect two fault-tolerant quantum computers at a distance that is less than the loss limits for the qubits, then the quantum error correction capabilities in the computers can in principle recover the quantum signal. If we build a chain of such quantum computers each passing quantum information to the next, then we can achieve the fault-tolerant quantum network that we need. This chain of computers linking together is reminiscent of the early classical internet when computers were used to route packets through the network. Today we use packet routers instead.

If you look under the hood of a packet router, it is composed of many powerful microprocessors that have replaced the computer routers and are much more efficient at the specific routing tasks involved. Thus, one might imagine a quantum analogue to the packet router, which would be a small purpose-built quantum computer designed for recovering and transmitting qubits through the network. These are what we refer to today as quantum repeaters, and with these quantum repeaters we could build a global quantum internet.

Currently there is much work underway to realise a fault-tolerant quantum repeater. Recently, a team in the NSF Center for Quantum Networks (CQN) achieved an important milestone: they used a quantum memory to transmit a qubit beyond its usual loss limit. This is a building block for a quantum repeater. The SFI Connect Centre in Ireland is also working on classical network control systems that can be used to operate a network of such repeaters.

Together we have our sights set on realising the networks that will make up the quantum internet.

By Dan Kilper and Saikat Guha

Dan Kilper is professor of future communication networks at Trinity College Dublin and director of the Science Foundation Ireland (SFI) Connect research centre.

Saikat Guha is director of the NSF-ERC Center for Quantum Networks and professor of optical sciences, electrical and computer engineering, and applied mathematics at the University of Arizona.

Link:
Looking to the future of quantum cloud computing - Siliconrepublic.com - Siliconrepublic.com

Read More..

The acceleration to cloud is causing a monitoring migraine – www.computing.co.uk

From a technology perspective, one of the biggest changes we've seen over the past year has been a dramatic acceleration in cloud computing initiatives. The pandemic has proven once and for all that cloud computing really does work, even in the most challenging of circumstances, providing greater speed, agility and resilience.

And with this new level of trust and appreciation of cloud computing, huge numbers of businesses have gone from running only a handful of applications in the cloud to wanting to shift significant parts of their IT estate over to a cloud environment, as quickly as they possibly can.

Indeed, as organisations have rushed through digital transformation programs to deliver new digital services to both customers and employees during the pandemic, most have relied heavily on the cloud to enable them to move at the required speed and scale.

The pandemic will certainly come to be seen as a tipping point in the transition to cloud computing, speeding up what was already an inevitable switch by several years. Indeed, Gartner has forecast that worldwide end-user spending on public cloud services will grow by 18.4 per cent in 2021, and that the proportion of IT spending on cloud computing will make up 14.2 per cent of the total global enterprise IT spending market in 2024, up from 9.1 per cent in 2020.

This marked shift towards cloud computing is undoubtedly delivering benefits, enabling the digital transformation initiatives organisations have relied on throughout the pandemic. In many cases, the level and speed of innovation that has been achieved simply wouldn't have been possible using legacy technologies.

However, there is always a sting! The rapid acceleration of cloud initiatives has had a profound impact on the IT department, adding huge complexity and even greater pressure onto technologists.

In our latest Agents of Transformation report, Agents of Transformation 2021: The Rise of Full-Stack Observability, we found that 77 per cent of global technologists are experiencing greater levels of complexity as a result of the acceleration of cloud computing initiatives during the pandemic. And 78 per cent cited technology sprawl and the need to manage a patchwork of legacy and cloud technologies as an additional source of complexity.

On the back of rapid digital transformation over the past year, technologists have rightly put even more focus on monitoring the entire IT estate, from customer-facing applications through to third-party services and core infrastructure like network and storage. But whilst their established monitoring approaches and tools have to a large degree given them greater visibility across traditional, legacy environments, they have been found wanting in new hybrid cloud environments.

The reason for this is that within a software-defined, cloud environment, nothing is fixed; everything is constantly changing in real-time. And that makes monitoring far more difficult.

Traditional approaches to monitoring were based on physical IT infrastructure - technologists knew they were operating five servers and 10 network wires - they were dealing with constants. This then allowed for fixed dashboards for each layer of the IT stack. But the nature of cloud computing is that organisations are continually scaling their use of IT up and down, according to business need. For instance, a company might be using two servers to support a customer-facing application, but then suddenly increase that to 25 servers to meet a surge in demand in real-time, before dropping back down to five a few hours later while adapting its network and storage infrastructure along the way.
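The elasticity described above is typically implemented as a target-tracking rule: pick a per-server load target and size the fleet to meet it. A hypothetical sketch (the function name, thresholds, and traffic figures are illustrative):

```python
import math

def desired_replicas(current: int, load_per_server: float,
                     target_load: float = 100.0,
                     min_replicas: int = 2, max_replicas: int = 25) -> int:
    """Size the fleet so each server carries roughly target_load requests/s."""
    total_load = current * load_per_server
    needed = math.ceil(total_load / target_load)
    return max(min_replicas, min(max_replicas, needed))

print(desired_replicas(2, 1250))  # surge: scale 2 servers -> 25
print(desired_replicas(25, 20))   # lull: scale 25 servers -> 5
```

Because the fleet size is recomputed continuously from live load, any monitoring dashboard built around a fixed server count is stale within minutes.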

Traditional monitoring solutions simply aren't designed for this dynamic use of infrastructure as code, and that means most technologists can no longer get visibility of their full IT stack health in a single pane of glass. In fact, three-quarters of technologists now report they are being held back because they have multiple, disconnected monitoring solutions, and worryingly, more than two-thirds admit they now waste a lot of time as they can't easily isolate where performance issues are actually happening. The acceleration of cloud computing initiatives is undoubtedly the major driver of this issue.

Looking ahead, technologists are under no illusions: the transition to the cloud is only going to gather pace, as organisations continue to prioritise digital transformation to get through the pandemic and exploit new opportunities in a turbulent marketplace.

Technologists are also fully aware that unless they find a way to gain greater visibility and insight into all IT environments, they will be unable to drive the rapid, sustainable digital transformation their organisations need. Indeed, 79 per cent of technologists state that they need to adopt more comprehensive observability tools to achieve their organisations' innovation goals.

Without genuine full-stack observability, technologists simply don't stand a chance of being able to quickly identify and fix technology issues before they impact end users and the business.

IT and business leaders need to recognise that unless they address this issue now, they are jeopardising all of their efforts and investment in digital transformation. Organisations can develop the most innovative, cloud-based applications for their customers and staff, but unless their technologists have the right level of visibility and tools to optimise IT performance in real-time, then they will never be able to deliver faultless digital experiences.

Technologists need to be able to monitor all technical areas across their IT stack, including within cloud environments, and to directly link technology performance to end user experience and business outcomes, so they can prioritise actions and focus on what really matters to the business. Get this right, and then organisations really can start to take full advantage of the cloud.

James Harvey is EMEAR CTO at Cisco AppDynamics

Visit link:
The acceleration to cloud is causing a monitoring migraine - http://www.computing.co.uk

Read More..

A Chance to Tap Cloud & 5G Through the Upcoming iShares’ ETF – Zacks.com

Cloud computing and 5G have been hot investing areas lately thanks to higher demand and stupendous stock market gains of the industry players. No wonder iShares Trust is on its way to launching an ETF on the dual concepts. The proposed fund is the iShares Cloud 5G and Tech ETF (IDAT).

The iShares Cloud 5G and Tech ETF looks to track the investment results of an index composed of companies from developed and emerging markets that could benefit from providing products, services, and technologies related to cloud computing and 5G. The fund would charge 47 bps in fees (read: 5 Most-Crowded Trades & Their Winning ETFs).
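For context, 47 bps means 0.47% of assets per year; the arithmetic is simple (the function below is just an illustration of how basis points translate to dollars):

```python
def annual_fee(investment: float, expense_ratio_bps: float = 47.0) -> float:
    """A basis point is one hundredth of a percent of invested assets."""
    return investment * expense_ratio_bps / 10_000

print(annual_fee(10_000))  # $47.00 per year on a $10,000 position
```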

Cloud computing is a process in which data or software is stored outside of a computer, but can be easily accessed anywhere, at any time via the Internet. This idea is effective as it helps firms to lower IT costs by eliminating the need for servers and related maintenance costs.

In the wake of the pandemic, cloud technology adoption is projected to witness robust growth in sectors where work-from-home initiatives are sustaining business functions. Globally, end-user spending on public cloud services is forecast to grow 23% in 2021 to a total $332.3 billion, according to Gartner (read: A Comprehensive Guide to Cloud Computing ETFs).

On the other hand, 5G, the next era of smarter, faster and more efficient wireless technology, has lately picked up pace. The initial round of rollouts has been gathering steam globally. It is operational in many major cities in the United States, as well as places in China, South Korea and the United Kingdom, among other countries.

Carriers are busy building foundations. Phone makers have also started launching 5G-enabled handsets. Investors should note that apart from the faster usage of mobile networks, 5G is going to strengthen the mechanism of the growing Internet of Things (IoT) so that a human-to-object interaction can be set up smoothly (read: 5G Gaining Immense Traction: ETFs to Bet On).

Defiance Next Gen Connectivity ETF (FIVG), Pacer Benchmark Data & Infrastructure Real Estate Sector ETF (SRVR), ALPS Disruptive Technologies ETF (DTEC) and First Trust Indxx NextG ETF (NXTG) are some of the ETFs that have exposure to 5G-enabled companies.

On the other hand, First Trust Cloud Computing ETF (SKYY), Global X Cloud Computing ETF (CLOU), WisdomTree Cloud Computing ETF (WCLD) and Wedbush ETFMG Global Cloud Technology ETF (IVES) are the ETFs that thrive on cloud computing.

Hence, there is tough competition in the space, though two concepts in one fund could be a winning proposition for IDAT, if it is ever approved.


Read more here:
A Chance to Tap Cloud & 5G Through the Upcoming iShares' ETF - Zacks.com

Read More..

Cloud migration and the catch-22 conundrum – ITWeb

Cloud computing is changing the way we do business. It offers increased scalability, flexibility and the opportunity to easily collaborate with fellow workers, customers and other stakeholders.

Cloud computing also enables software homogenisation across the business, giving every staff member access to the same current and updated software.

Today, increasing numbers of businesses are using cloud computing services in various forms. According to Gartner research, by 2022, almost 90% of all businesses will operate in the cloud to a greater or lesser extent.

While the remote-working trend has largely driven many organisations to adopt a cloud-based philosophy, it has also given them access to the big business benefits of increased processing power and improved data storage capacities.

However, as important and worthy as investments in cloud technologies are, there are challenges associated with cloud-migration moves.

For example, in their haste to adopt cloud-based solutions, some companies are putting their remaining traditional on-premises infrastructures in jeopardy by placing scheduled network updates on the back-burner.

Conversely, other organisations minimise their cloud migration efforts in a bid to extend the life of existing traditional network assets in order to leverage capital investments in them.

The nett result is an increase in obsolete and unpatched devices containing software vulnerabilities with a number of networks exposed to information security threats.


As acclaimed research scientist and author Daniel Hein says: "If your business isn't prepared to deal with the challenges of cloud migration, then it could be costly and dangerous for you and your data."

In short, outdated hardware and applications create vulnerabilities, making them easy targets for hackers whose goal is to infiltrate these increasingly-flawed networks.

There are other circumstances that can impact cloud migration and help prolong the life of outdated infrastructures.

For instance, some organisations take the view that cloud migration, with the addition of new cloud assets and networked devices, will introduce a significant degree of complexity into their IT operations.

They then plan for any cloud migration activities to take effect only after existing IT staff have reached the skill levels required to manage, integrate and maintain the processes.

Moreover, the upskilling of these employees is often seen as an addition to their current responsibilities for on-premises IT management and maintenance, which may suffer as a result.

Research findings reveal that more than 50% of companies find cloud migration more difficult than expected, with projects exceeding budgets and missing completion deadlines. This is particularly true for organisations burdened with older on-premises implementations.

This creates a catch-22 situation with businesses holding on to aging, underperforming IT platforms, hoping to postpone the evil day when a move to cloud computing becomes imperative.

However, as many network managers will confirm, the older the technology, the more costly it becomes to effect an update or repair. Therefore, a reliance on outdated solutions will negatively impact business agility and limit an organisation's ability to adapt quickly to market changes such as the work-from-home movement.

Similarly, such an imprudent strategy will also impair an organisation's capability to respond rapidly to changing customer demands.

Of course, there are isolated instances where cloud migration may be delayed by special circumstances, such as a reliance on proprietary technology which, for legal reasons, may be unable to be deployed to the cloud.

Against this backdrop, making the move away from a traditional IT environment to cloud computing must be seen as a major step, with decision-making certain to impact the company from many aspects, including but not limited to bottom-line profitability and medium- to long-term growth.

As Ron Lopez, executive vice-president of NTT, notes in a published statement: "The network is the platform for business digital transformation. It needs to be ubiquitous, flexible, robust and secure to adapt easily to business change, while increasing the maturity of the operational support environment."

In this light, businesses are best advised to take the earliest opportunity to appraise strategies related to their network and security architectures, and review plans for operating and support models. The objective should be to better manage operational risk and achieve a degree of maturity in operational support structures.

In most cases, the advice of specialists is necessary to assist organisations in the planning phase ahead of their cloud journey, which needs to be a smooth and seamless experience.

As eminent IT industry luminary Josh LeSov says: "The biggest challenge companies face when migrating to the cloud is their preparedness. You need to work with a seasoned implementation team that has strong project management skills, system experience and industry expertise.

"Additionally, this team needs to be able to stick around after the implementation is done, since on-going support is always required."

Full visibility into an organisation's IT infrastructure before, during and after cloud migration is imperative, as is the adoption of modern technologies and techniques in order to eliminate potential pitfalls in the process which might otherwise compromise data, applications and day-to-day business activities.

Importantly, with budgets under ever-increasing pressure, costs must be accurately predicted and expertly managed. It is vital for in-house and consulting teams focusing on IT, security and operations to be on the same page in order to create a successful cloud migration blueprint.

View original post here:
Cloud migration and the catch-22 conundrum - ITWeb

Read More..

The future of asset health is in the cloud – Canadian Mining Journal

Emerging technologies in predictive maintenance demand a cloud infrastructure for their unique capabilities: remote data storage and aggregation, machine learning, and IIoT-based automation. Credit: Wenco

Asset health technologies have transformed the reliability of mining equipment over the past generation. By tapping into the equipment's onboard sensors, maintenance teams can observe and record hundreds of parameters that indicate equipment health. Understanding this data and its effects has empowered mines to improve mean time between failures (MTBF), uptime, and other maintenance KPIs more than any tools in recent memory.

Yet, these technologies have their limitations. When installed exclusively on premises, asset health systems miss the advantages available with the power of cloud computing. In 2021, many innovations in predictive maintenance demand a cloud infrastructure and its unique capabilities to deliver optimal value. Remote data storage and aggregation, access to machine learning algorithms, and IIoT automation all rely on cloud technologies that are increasingly necessary elements in a forward-thinking mine maintenance program.

Fortunately, advances in data processing and communications technologies are making cloud solutions more viable for the mining industry. While traditionally resistant to cloud implementations, mines are now leveraging the capabilities of cloud computing, and their maintenance departments are seeing the benefits. New solutions are empowering maintenance teams to do their jobs better in ways that were impossible a few years ago: predicting component fatigue from early warning signs at the edge, observing changes in equipment performance on a continuous basis, and even collaborating with OEMs on proactive asset management that leverages integrated digital platforms.

Real-time analytics, now at the edge

Edge devices installed on mobile and plant equipment are the point of entry for much of the data in any asset health infrastructure. Traditionally, these low-powered hardware units provided simple data processing near the source of operation, streaming that information to a cloud server for aggregation with other datasets and cross-platform analysis.

While this configuration can work well, the wealth of sensors and data now available to mines and their maintenance teams often proves too voluminous and costly to manage in this way. Bandwidth restrictions and communication costs mean that traditional cloud infrastructures struggle to handle the requirements of emerging IIoT systems. Instead, new solutions see more and more calculations happening at the edge itself.

Long-established vendors like Emerson, as well as startups like FogHorn, are bringing advanced capabilities like analytics and AI to lightweight devices near the source of a data stream. Today's edge devices are able to take raw sensor data (temperature, pressure, vibration, events, and more) and perform complex computations independent of a powerful cloud server. Data ingestion, processing, and reporting can now happen near the source, providing real-time, cost-effective insights to maintenance personnel. After that time-sensitive information has been communicated, the systems can publish compressed data to their cloud counterparts for richer analysis and long-term storage.
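To make the division of labour concrete, here is a minimal Python sketch of the edge-side pattern described above. The function names, threshold, and sample values are illustrative assumptions, not taken from any vendor's product: raw readings are reduced to a compact summary locally, the time-sensitive alert decision is made at the edge, and only the small summary record is published upstream.

```python
import statistics

def summarize_window(samples):
    """Reduce a window of raw sensor readings to a compact summary record."""
    return {
        "mean": statistics.fmean(samples),
        "peak": max(samples),
        "stdev": statistics.pstdev(samples),
    }

def should_alert(summary, peak_limit=9.0):
    """Time-sensitive check that runs entirely at the edge (limit is illustrative)."""
    return summary["peak"] > peak_limit

# e.g. a window of vibration readings (mm/s) from one truck component
window = [8.1, 8.3, 8.2, 9.4, 8.0]
summary = summarize_window(window)
if should_alert(summary):
    print("alert maintenance now:", summary)  # real-time, local decision
# Afterwards, only `summary` (not the raw stream) would be published to the
# cloud tier for aggregation, richer analysis, and long-term storage.
```

The bandwidth saving comes from shipping the three-field summary instead of every raw sample; the cloud still receives enough to refine fleet-wide models.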

"It's a two-way street," says Vien Dang, asset health specialist for Wenco International Mining Systems. "Edge and cloud solutions work together. You train edge devices using a cloud-hosted model of what a healthy equipment unit looks like, then set it loose to respond to real-world applications.

"Reliability teams get clean, accurate reporting quickly so they can respond quickly. Then, that data feeds up to the cloud, improving the model they started with. Over time, the whole process gets faster, more accurate, and more responsive, with minimal latency or bandwidth issues."

Digital twins deliver precise, specific asset health modelling

Today's inexpensive sensors and edge devices can easily produce vast streams of data, but making sense of it is another challenge. Often, maintenance teams have access to volumes of data, but lack useful information to diagnose emerging problems and intervene to prevent failures.

Rithmik Solutions is changing that. The Montreal-based company's Asset Health Analyzer (AHA) uses machine learning and a rapid analytics infrastructure to create accurate, site-specific equipment health baselines that enable early detection and diagnosis of maintenance issues.

Other asset health technology may claim to enable early issue detection, but AHA analytics go beyond manual error thresholds and standard AI models. In effect, AHA uses a multi-tiered AI approach with digital twins, which act as virtual companions for the entire equipment fleet. This approach fundamentally transforms a mine's preventive maintenance program, letting technicians follow component health on an ongoing basis and examine the exact condition of monitored parts before pulling them down for maintenance.

"There are a lot of advantages to embedding digital twins within a multi-layered AI approach," says Amanda Truscott, co-founder and CEO of Rithmik Solutions. "Earlier alarms without any threshold setting; insight about what's going wrong, what's about to go wrong, and what went wrong in the past; and the ability to prioritize maintenance based on actual equipment health."

AHA uses machine learning to quickly build a contextualized baseline for the best-performing equipment at the mine. It then monitors equipment for any difference from that tuned-in normal state, providing deep and early insights into equipment issues so mines can prevent small problems from escalating. By maintaining models of standard equipment in this way, AHA also allows for cross-asset comparison, highlighting how like assets are similar and how they vary.
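As a simplified stand-in for the baseline-and-deviation idea (not Rithmik's actual multi-tiered algorithm, and with invented readings), even a plain z-score against a baseline fitted from known-healthy operation captures the principle: learn the site-specific "normal", then flag any unit that drifts away from it.

```python
import statistics

def fit_baseline(healthy_readings):
    """Learn a contextualized 'normal' from the best-performing equipment."""
    mu = statistics.fmean(healthy_readings)
    sigma = statistics.pstdev(healthy_readings)
    return mu, sigma

def deviation_score(reading, baseline):
    """How many standard deviations a new reading sits from the tuned-in normal."""
    mu, sigma = baseline
    return abs(reading - mu) / sigma

# e.g. oil temperatures (deg C) logged while a unit was known to be healthy
baseline = fit_baseline([71.0, 72.5, 70.8, 71.9, 72.2])

print(deviation_score(71.5, baseline))  # small: close to site-specific normal
print(deviation_score(78.0, baseline))  # large: early sign of an emerging issue
```

Because the same baseline can be fitted per asset, comparing scores across like units also gives the cross-asset comparison the article describes.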

Trials of AHA have already shown strong results, providing alarms hours or even days ahead of OEM alerts. In one case, rod-bearing failures on Cat 793Ds were costing a site in Canada $4 million a year due to a late OEM warning that came only a few minutes before the failure occurred. AHA was able to find indicators of those failures 10 hours earlier, a relative lifetime for maintenance to intervene.

"In another recent trial in collaboration with our partner Wenco's digital platform, our Asset Health Analyzer rapidly uncovered a customer's fleet-wide inefficiency that had gone undetected for multiple years by both the equipment dealer and the mine maintenance team," said Kevin Urbanski, co-founder and CTO of Rithmik Solutions.

"What had happened was that temperature regulators failed on 76% of the mine's haul truck fleet. Fixing the issue is going to both extend the life of the engines and result in significant fuel savings."

Urbanski says AHA also pulled out previously unknown failure mode indicators on two separate chronic machine issues, which Rithmik and its customer are now using to generate earlier alerts of the failure modes. These insights are also providing a deeper understanding of the total impact of these failure modes on the machines themselves.

Cloud platforms create an ecosystem of partners in mine asset health

Cloud-based platforms are another emerging development in asset health. While digital portals are already common in medicine, entertainment, and enterprise business systems, they are new for mine maintenance.

The concept mirrors existing asset health systems: Sensor data streams to a server, which processes and reports real-time or historical information that maintenance technicians use to understand equipment condition. However, transferring this data to a secure cloud platform instead of an on-premises server opens up many opportunities for mining companies, including access to IIoT and AI-based analysis and stronger collaboration with OEM dealers.

Wenco and Hitachi Construction Machinery (HCM) are currently developing such a cloud-based solution, known as ConSite Mine. Operating on a digital IIoT platform, ConSite Mine remotely aggregates and processes the large volume of data associated with asset health for every installed unit at a mine site, displaying it on a customized dashboard for each customer.

ConSite Mine dashboard. Advanced digital technology helps extend equipment life and improve productivity and safety by providing the information to predict issues, such as visualizing signs of structural cracks. Credit: Hitachi Construction Machinery

Existing asset health systems may also allow customers to monitor equipment health in real time and anticipate issues before they occur, but a cloud solution like ConSite Mine enables the participation of partners outside the walls of the maintenance facility. With ConSite Mine, HCM dealers are able to remotely monitor equipment health in conjunction with their customers, leveraging their expertise and forging a partnership in keeping units running. Dealer technicians supporting their customers can proactively analyze asset health information through the online dashboard, then pre-order parts and schedule planned maintenance, avoiding the costs and delays of unplanned downtime from failed equipment.

"There are so many opportunities with a digital solution like ConSite Mine," says Dang. "For example, the system can detect signs of a pending failure of an excavator's hydraulic pump, then let the customer and HCM dealer know well ahead of time. The dealer can check their parts inventory, order a replacement, and schedule the install from their office, taking the pressure off the mine's maintenance team.

"That one preventive intervention could save the mine $1 million, easy."

Maintenance and operations data can feed into these emerging cloud platforms, enabling mine personnel, dealers, and consultants to investigate root causes, perform failure modes and effects analysis, and contribute to improved policies and structural designs of future equipment. Taking it further, cloud platforms like ConSite Mine are able to integrate services from other OEMs and third parties, creating an ecosystem of partners all working in support of the mines business objectives.

"By bringing in OEMs and third parties, maintenance teams aren't going it alone anymore," says Dang. "They have specialists who are the most knowledgeable people in the world working with them 24/7 to extend their MTBF and reduce downtime.

"And, really, it's only feasible with the cloud."

Devon Wells is the corporate marketing manager for Wenco International Mining Systems, a Hitachi Construction Machinery group company. To learn more, visit http://www.wencomine.com

Original post:
The future of asset health is in the cloud - Canadian Mining Journal


India's public cloud spending on a roll – ComputerWeekly.com

India's spending on public cloud services reached $3.6bn in 2020, as more businesses in the subcontinent turned to cloud computing to ride out the ongoing pandemic, according to IDC.

Much of the growth came in the second half of the year, where revenue from cloud-based infrastructure, platform and applications totalled $1.9bn. The overall Indian public cloud services market is expected to reach $9.5bn by 2025, representing a compound annual growth rate of 21.5%.
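IDC's projection is internally consistent: compounding $3.6bn at a 21.5% annual rate over the five years from 2020 to 2025 lands at roughly $9.5bn, as a quick calculation confirms.

```python
# Sanity check of the IDC figures quoted above.
base_2020 = 3.6            # market size in 2020, $bn
cagr = 0.215               # compound annual growth rate
projected_2025 = base_2020 * (1 + cagr) ** 5
print(round(projected_2025, 1))  # -> 9.5
```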

Rishu Sharma, principal analyst for cloud and artificial intelligence at IDC India, noted the critical role that public cloud services played for organisations in 2020 as enterprises looked to build digital resiliency.

"Cloud will become crucial as organisations expedite the development process and deployment of business applications to meet the changing work and business environment," she added.

Cloud-based applications made up the lion's share of overall public cloud spending, followed by cloud-based infrastructure and platforms. According to IDC, the top two service providers held 49% of the Indian public cloud services market in 2020.

"Even though enterprises in the country have been discussing cloud adoption for the past few years, the Covid-19 pandemic forced them to expedite their cloud strategy. This accelerated cloud adoption in the country by several years," said Harish Krishnakumar, senior market analyst at IDC India.

"Businesses started adopting cloud to host a wide array of applications, ranging from e-mail servers to more complex systems like data warehousing and advanced analytics. There was also an increased migration of enterprise applications to the cloud," he added.

Indian organisations are already using public cloud services in a big way. Tata Capital, for example, is using virtual assistants such as Amazon Alexa to deliver services, while the National Commodity and Derivatives Exchange has migrated 50 applications to AWS after a fire in 2018.

India's growing demand for public cloud services has drawn major cloud suppliers to shore up their investments in the country.

In November 2020, Amazon Web Services said it was investing $2.8bn in a second cloud region in India, while Microsoft has teamed up with Indian telecoms giant Jio to deliver cloud infrastructure services through two new datacentres being built in Gujarat and Maharashtra.

Not to be outdone is Google, which will open a Delhi cloud region by 2021, its second in India since it launched its Mumbai cloud region in 2017.

Google said the new region will enable Indian organisations to take advantage of its big data and infrastructure services onshore while complying with Indias data laws and regulations.

Excerpt from:
Indias public cloud spending on a roll - ComputerWeekly.com


Cognitive Cloud Computing Market Next Big Thing: Major Giants Google, 3M, Microsoft – The Shotcaller

A latest intelligence report published by AMA Research, titled "Cognitive Cloud Computing Market Outlook to 2026", is a detailed study accumulated to offer the latest insights into acute features of the global Cognitive Cloud Computing market. The report provides a detailed overview of key factors in the market, such as drivers, restraints, past and current trends, regulatory scenarios and technology development. A thorough analysis of these factors, including the economic slowdown, local and global reforms and the COVID-19 impact, has been conducted to determine future growth prospects in the global market.

Definition: Cognitive computing is the use of computerized models to simulate the human thought process in complex situations where the answers may be ambiguous and uncertain. The phrase is closely associated with IBM's cognitive computer system, Watson. Cognitive computing overlaps with AI and involves many of the same underlying technologies to power cognitive applications, including expert systems, neural networks, robotics and virtual reality (VR).

Major players in this report include: 3M (United States), Google LLC (United States), Hewlett Packard Enterprise Development LP (United States), International Business Machines Corporation (United States), Microsoft Corporation (United States), Nuance Communications Inc. (United States), Oracle Corporation (United States), SAP SE (Germany), SAS Institute Inc. (United States), Tibco Software Inc. (United States)

Free Sample Report + All Related Graphs & Charts @: https://www.advancemarketanalytics.com/sample-report/161546-global-cognitive-cloud-computing-market

Market Trends:

Market Drivers:

Market Opportunities:

The Cognitive Cloud Computing market segments and market data breakdown are illuminated below: by Deployment Type (On-Premise, Cloud), Organization Size (SMEs, Large Enterprises), Technology (Natural Language Processing, Machine Learning, Automated Reasoning, Others), Industry Vertical (Healthcare, BFSI, Retail, Government & Defense, IT & Telecom, Energy & Power, Others).

The manufacturing cost structure analysis of the Cognitive Cloud Computing market is based on the core chain structure, engineering process, raw materials and suppliers. In addition, market attractiveness according to country, end user and other measures is also provided, permitting the reader to gauge the most useful or commercial areas for investment. The study also provides a special qualitative chapter designed to highlight issues faced by industry players in their production cycle and supply chain. The overall estimates and sizing, along with the various tables and graphs presented in the study, give an impression of how big the impact of COVID is.

Enquire for customization in Report @: https://www.advancemarketanalytics.com/enquiry-before-buy/161546-global-cognitive-cloud-computing-market

Geographically, the world Cognitive Cloud Computing market can be classified as North America, Europe, Asia Pacific (APAC), Middle East and Africa, and Latin America. North America has gained a leading position in the global market and is expected to remain in place for years to come. The growing demand for cognitive cloud computing will drive growth in the North American market over the next few years.

In the last section of the report, the companies responsible for increasing the sales in the Cognitive Cloud Computing Market have been presented. These companies have been analyzed in terms of their manufacturing base, basic information, and competitors. In addition, the application and product type introduced by each of these companies also form a key part of this section of the report. The recent enhancements that took place in the global market and their influence on the future growth of the market have also been presented through this study.

Report Highlights:

Strategic Points Covered in Table of Content of Cognitive Cloud Computing Market:

Chapter 1: Introduction, market driving force, product objective of study and research scope of the Global Cognitive Cloud Computing market

Chapter 2: Exclusive summary – the basic information of the Global Cognitive Cloud Computing market

Chapter 3: Changing impact on market dynamics – drivers, trends, challenges and opportunities of the Global Cognitive Cloud Computing market; post-COVID analysis

Chapter 4: Presenting the Global Cognitive Cloud Computing market factor analysis, post-COVID impact analysis, Porter's Five Forces, supply/value chain, PESTEL analysis, market entropy, and patent/trademark analysis

Chapter 5: Displaying the market by type, end user and region/country, 2015-2020

Chapter 6: Evaluating the leading manufacturers of the Global Cognitive Cloud Computing market which consists of its Competitive Landscape, Peer Group Analysis, BCG Matrix & Company Profile

Chapter 7: Evaluating the market by segments, by countries and by manufacturers/companies, with revenue share and sales by key countries in these various regions (2021-2026)


Buy this research @ https://www.advancemarketanalytics.com/buy-now?format=1&report=161546

Key questions answered

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Middle East, Africa, Europe or LATAM, and Asia.

Contact Us:

Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road, Edison, NJ, New Jersey, USA 08837
Phone: +1 (206) 317 1218
sales@advancemarketanalytics.com

Connect with us at:
https://www.linkedin.com/company/advance-market-analytics
https://www.facebook.com/AMA-Research-Media-LLP-344722399585916
https://twitter.com/amareport

View post:
Cognitive Cloud Computing Market Next Big Thing: Major Giants Google, 3M, Microsoft – The Shotcaller


AI, cloud to bring about ‘next generation’ of GAO oversight – Federal News Network

The Government Accountability Office is resolute in its commitment to transforming its oversight through artificial intelligence and cloud systems.

GAO's chief scientist Tim Persons said in an interview with Federal News Network that these emerging capabilities have transformed analytics within the agency. Users can now search keywords to yield specific paragraphs, and interactive dashboards enable staff to immerse themselves in different pockets of data.

"We have special authorities and access as an agency into just the entire array of federal government problems," Persons said of GAO, as part of Federal Monthly Insights: Cloud and Artificial Intelligence. "We're trying to make good government better government ... And a lot of that is computing, or converting, questions into answers in the state-of-the-art way."

In the past year, for example, GAO has worked closely with the General Services Administration's Centers of Excellence program to build a cloud infrastructure for better analytics. A cloud-based system, refined in GAO's Innovation Lab, stands out as one example of technology that's ushering in the next generation of oversight.

"We didn't invent analytics by coming up with the Innovation Lab," Persons said on Federal Drive with Tom Temin. "It was a different environment: sandbox, agile, cloud-based, new tools inclusive of AI."

Another instance of this innovation grew visible within Operation Warp Speed. As the country inched closer to a vaccine rollout, GAO created a data analytics vaccine dashboard using cloud services. The agency took data from different health enterprises, such as the National Institutes of Health, and provided updates in real time to Congress and the White House.

"It was nice to have this state-of-the-art, cloud-based, real-time updatable type thing, which we think is a model for, is exciting for, what we can do in the future," Persons said.

In GAO's Innovation Lab, Persons and his team build their technology through a process of reverse engineering. They look at the deficiencies in government oversight, then build capabilities like AI in the cloud to solve those problems.

But GAO won't utilize AI in a way that entirely eliminates human involvement, Persons said. While machine learning has a multitude of applications, Persons envisions a human-centered future in oversight rather than a droid-centered one.

Improving this technology requires constant iterating, though. Persons said the Innovation Lab includes a sandbox environment, where analysts and investigators can experiment in a trial-and-error fashion.

"When AI fails, not if, [but] when it fails, you understand why it fails, and you iterate and fix the problem and then drive toward a better solution," Persons said. "It's a different mindset than what often is in the federal government, where 'failure is not an option.'"

The process of refinement becomes especially important within GAO, which has a higher risk profile around its data than many agencies.

"GAO are the stewards of everyone else's data," Persons said. "Cloud often sounds like I'm just going to dump all this in a data lake. It's not that at all. You're going to have a nice, strong data governance system."

These innovations will also transform analytics within GAO to a less hands-on system for chief information officers and other employees. Systems will require less day-to-day management, as Persons said he believes the shift to a cloud system will reduce the agency's on-premises data center footprint.

Converting to cloud, however, also means reskilling the workforce. Already, staff have trained to become more data literate. And in the Innovation Lab, workforce training is as critical as the technology itself, Persons said.

See the article here:
AI, cloud to bring about 'next generation' of GAO oversight - Federal News Network


Google Cloud to start hosting some parts of YouTube platform – DIGIT.FYI

Tech firm Google has announced it intends to move some parts of video platform YouTube onto its Google Cloud systems.

YouTube is currently run on internal computer systems held at the tech firm's data centres. However, Google said last week it wants to begin moving across to the cloud as it looks to expand further into the cloud computing market.

Migration would also help the firm to become less reliant on advertisements within searches and on videos.

In an interview with CNBC, Google Cloud CEO Thomas Kurian said: "Part of evolving the cloud is having our own services use it more and more, and they are. Parts of YouTube are moving to Google Cloud."

Speaking to the US broadcaster, Kurian was not clear on the timeframe of the move to the Google Cloud platform, the amount of YouTube's data being migrated, or which parts would be transferred.

Google has historically used a hybrid storage system, allowing its data centres to coexist with its cloud platform, and so far has made little attempt to fully migrate its larger properties to its public cloud. Currently, smaller programmes like Waze, Google Workspace and DeepMind use Google Cloud infrastructure.

And YouTube is certainly a big platform to start with. Google acquired YouTube in 2006 in a deal worth around $1.65 billion, and it is currently the second-largest website online. The platform boasts a huge number of viewers per month, with current estimates at more than 2 billion.

Google's move to migrate large elements of its empire across to its cloud service brings it more in line with competitors Amazon and Microsoft, who are both huge players in the cloud computing market.

The cloud is fast becoming a viable option for storage purposes, with services like Amazon Web Services (AWS) used by thousands of companies around the world. The cloud can be massively valuable for firms, too, particularly during the Covid-19 pandemic, when revenue at AWS grew by 32% to $13.5bn.

Google Cloud is now being recognised as a potentially important part of the fintech sector in Scotland, following the announcement in November 2020 that the service had been welcomed by FinTech Scotland into the country's fintech cluster to help the growth of the country's SME community.

In January, Edinburgh University became the first in Scotland to announce the migration of its core IT systems to the Oracle Cloud.

The three-phase implementation project was delivered with computer consultancy firm Inoapps, with the first stage of the university's People and Money programme now live in the Oracle Cloud.

The shift to cloud-based storage processes will be a key theme at the upcoming Cloud First Virtual Summit, held on 23rd June.

The conference will bring together senior technologists, Cloud architects and business transformation specialists to explore new advancements and best practice.

Register your free place now at: www.cloudfirstsummit.com


Read more:
Google Cloud to start hosting some parts of YouTube platform - DIGIT.FYI
