
The rise of cloud computing is having an impact on data center efficiency and it’s not great – CIO Dive

Dive Brief:

Uptime Institute's 2019 survey found data centers averaged a PUE of 1.67, versus a PUE of 1.8 in 2011, meaning more of the energy that data centers consume is used for computing processes. That's a significant improvement, but the industry may be losing ground.

The rise of cloud computing is putting downward pressure on data center efficiency, as fewer of the computers in those facilities are now doing the actual work, according to the Uptime Institute, an advisory group focused on business infrastructure. This follows years of improvement.

"Improvements in data center facility energy efficiency have flattened out and even deteriorated slightly in the past two years," according to the firm's annual survey, conducted online in March and April with 1,600 respondents.

Larger and more efficient data centers that power the cloud are doing more of the work now, with efficiency improvements slowing at smaller centers as the computing migrates.

"It's not overall efficiency that is stalled out it's infrastructure efficiency," Matt Stansberry, the institute's VP of North American operations, told Utility Dive. "For a long time, data centers were highly inefficient so, for every unit of energy used to run IT equipment doing productive work, there was a giant amount of overhead."

The PUE ratio "went down over the last 10 years as people started paying attention and making improvements," Stansberry said. Primarily, those improvements were to cooling systems. But with the rise of cloud computing, companies are using fewer computers and instead are relying on cloud-based systems.

"But [data center]buildings don't shift in real-time" to adjust the infrastructure supporting those computers, Stansberry said.

As more computing moves to the cloud, these data centers are left with fewer computers and may end up over-built and less efficient. Utilities likely won't see significant load changes in the near term, though older data centers could close and growth in new additions could slow.
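For context, PUE (power usage effectiveness) is simply the ratio of total facility energy to the energy delivered to IT equipment. A minimal sketch of the calculation, using illustrative numbers rather than figures from the survey, shows why stranded overhead pushes the ratio up when IT load shrinks:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total energy drawn by the facility
    divided by the energy delivered to IT equipment. 1.0 is perfect;
    lower is better."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative numbers only (not from the Uptime Institute survey):
# a site drawing 1,670 kWh in total to power 1,000 kWh of IT load.
print(pue(1_670, 1_000))  # 1.67 -> roughly 60% of energy reaches the IT gear

# If half the workload migrates to the cloud but the cooling and power
# overhead (670 kWh) stays fixed, the ratio worsens even though nothing
# about the building changed.
print(pue(1_170, 500))    # 2.34
```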

Further improvements to data center efficiency "will require significant investment and effort, with increasingly diminishing returns," Uptime Institute said. "While managers and operators should remain vigilant and seek to maintain high facility efficiency levels, higher gains may be found by focusing on IT efficiency."

That doesn't mean there isn't room for more traditional efficiency, according to Jim Kozlowski, VP of global capacity planning and data center operations at Ensono, an IT services provider.

"Utilities help drive data center efficiency," Kozlowski told Utility Dive in an email. "By driving incentives or better economics, data center users will install more energy efficient infrastructure in the long term."

Many companies are in the process of modernizing their data centers, Kozlowski said. "So as they upgrade equipment and building management systems to meet certain standards, energy efficiency is improving."

The federal government has been keeping an eye on the sector as well. In 2016, the U.S. Department of Energy's Advanced Research Projects Agency-Energy offered $25 million for projects and technologies focused on increasing the energy efficiency of data centers.

Read the original post:
The rise of cloud computing is having an impact on data center efficiency and it's not great - CIO Dive

Read More..

NSF Awards Grant to Research Team to Develop Next-Generation Cloud Computing Testbed – HPCwire

RALEIGH, N.C., Dec. 19, 2019 – Red Hat, Inc., the world's leading provider of open source solutions, today announced that the National Science Foundation (NSF) Division of Computer and Network Systems has awarded a grant to a research team from Boston University, Northeastern University and the University of Massachusetts Amherst (UMass) to help fund the development of a national cloud testbed for research and development of new cloud computing platforms.

"By providing capabilities that currently are only available to researchers within a few large commercial cloud providers, the new testbed will allow diverse communities to exploit these technologies, thus democratizing cloud-computing research and allowing increased collaboration between the research and open-source communities," said Michael Zink, Associate Professor, Electrical and Computer Engineering, University of Massachusetts Amherst.

The testbed, known as the Open Cloud Testbed, will integrate capabilities previously developed for the CloudLab testbed into the Massachusetts Open Cloud (MOC), a production cloud developed collaboratively by academia, government, and industry through a partnership anchored at Boston University's Hariri Institute for Computing. As a founding industry partner and long-time collaborator on the MOC project, Red Hat will work with Northeastern University and UMass, as well as other government and industry collaborators, to build the national testbed on Red Hat's open hybrid cloud technologies.

Testbeds such as the one being constructed by the research team are critical for enabling new cloud technologies and making the services they provide more efficient and accessible to a wider range of scientists focusing on research in computer systems and other sciences.

By combining open source technologies and a production cloud enhanced with programmable hardware through field-programmable gate arrays (FPGAs), the project aims to close a gap in computing capabilities currently available to researchers. As a result, the testbed is expected to help accelerate innovation by enabling greater scale and increased collaboration between research teams and open source communities. Red Hat researchers plan to contribute to active research in the testbed, including a wide range of projects on FPGA hardware tools, middleware, operating systems and security.

Beyond this, the project also aims to identify, attract, educate and retain the next generation of researchers in this field and accelerate technology transfer from academic research to practical use via collaboration with industry partners such as Red Hat.

Since its launch in 2014, Red Hat has served as a core partner of the MOC, which brings together talent and technologies from various academic, government, non-profit, and industry organizations to collaboratively create an open, production-grade public cloud suitable for cutting-edge research and development. The MOC's open cloud stack is based on Red Hat Enterprise Linux, Red Hat OpenStack Platform and Red Hat OpenShift.

Beyond creating the national testbed, the grant will also extend Red Hat's collaboration with Boston University researchers to develop self-service capabilities for the MOC's cloud resources. For example, via contributions to the OpenStack bare metal provisioning program (Ironic), the collaboration aims to produce production-quality Elastic Secure Infrastructure (ESI) software, a key piece in enabling more flexible and secure resource sharing between different datacenter clusters. And by sharing new developments that enable moving resources between bare metal machines and Red Hat OpenStack or Kubernetes clusters in open source communities such as Ironic or Ansible, Red Hat and the MOC's researchers are helping to advance technology well beyond the Open Cloud Testbed.

"This testbed will help accelerate innovation in cloud technologies, technologies affecting almost all of computing today," said Michael Zink, associate professor, Electrical and Computer Engineering (ECE), University of Massachusetts Amherst. "By providing capabilities that currently are only available to researchers within a few large commercial cloud providers, the new testbed will allow diverse communities to exploit these technologies, thus democratizing cloud-computing research and allowing increased collaboration between the research and open-source communities. We look forward to continuing the collaboration in MOC to see what we can accomplish with the testbed."

About Red Hat

Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Source: Red Hat

Excerpt from:
NSF Awards Grant to Research Team to Develop Next-Generation Cloud Computing Testbed - HPCwire

Read More..

Data security is the biggest worry in cloud computing – ITProPortal

Despite understanding the advantages that cloud computing brings to the business world, many are still fearful of the technology, a new report by Savoystewart.co.uk argues.

The company polled 6,000 professionals from the UK, US, Germany, France, Italy and Spain to better understand the sentiment towards cloud solutions and concluded that the respondents mostly fear data leaks or unauthorised access to customers' stored data.

It also claims that those working in non-tech sectors, particularly financial, insurance and healthcare, are the most fearful.

Overall, the financial sector is fairly pessimistic when it comes to cloud computing implementation, with the Italians and the Americans being most distrustful.

Those who work in hospitality are mostly in favour of cloud computing, which the report's authors see as good news for an industry that could stand to cut energy, hardware and operational costs through the cloud.

Receiving direct bookings from clients through cloud technology would mean cutting booking agency commissions and increasing profit margins for hotel owners, it argues.

Looking at the UK specifically, the report found that the country's tech and hospitality sectors are most concerned about the availability of cloud computing services. As no company can offer 100 per cent uptime, this represents a problem for businesses which could lose clients during maintenance.

Organisations are fearful of data breaches as they could result in significant losses, loss of business and customer trust, remediation costs and necessary upgrades, as well as penalties and fines by communications watchdogs and other government agencies.

Read this article:
Data security is the biggest worry in cloud computing - ITProPortal

Read More..

Google reportedly set a goal of being a top-two cloud player by 2023 – CNBC

Google CEO Sundar Pichai

Getty Images

In early 2018, top executives at Alphabet debated whether the company should leave the public cloud business, but eventually set a goal of becoming a top-two player by 2023, according to a report from The Information on Tuesday.

If the company fails to achieve this goal, some staffers reportedly believe that Alphabet could withdraw from the market completely.

After the report was published, a Google spokesperson told CNBC the article was "not accurate," and disputed that the company debated leaving the cloud market in 2018.

While Alphabet subsidiary Google is dominant in web search and advertising, the company is still a small player in cloud computing, which involves renting out computing and storage resources to other companies, schools and governments. In 2018 the company lagged Amazon, Microsoft and Alibaba in that market, according to industry research firm Gartner.

Alphabet doesn't break out revenue for the Google cloud business but said in July that it had reached $8 billion in annualized revenue, or roughly $2 billion per quarter. Amazon Web Services, the market leader, generated $9 billion in revenue during the third quarter alone. Microsoft doesn't specify revenue from its Azure cloud, but Griffin Securities analyst Jay Vleeschhouwer estimated that Azure delivered $4.3 billion in revenue in the third quarter.

Google co-founder Larry Page, who was Alphabet's CEO at the time, reportedly thought being a distant third-place in cloud was not acceptable. But eventually he, CFO Ruth Porat, and then-Google CEO Sundar Pichai decided that Alphabet should remain in the cloud business, according to the report. The company set a five-year budget for capital expenditures of $20 billion, in part to reach that cloud goal.

The company replaced VMware co-founder Diane Greene, who had been leading the cloud business, with Oracle executive Thomas Kurian at the start of 2019. Pichai replaced Page as Alphabet's CEO earlier this month.

Read the full Information article here.

Update: This article has been updated to reflect that on Tuesday afternoon, a Google spokesperson told CNBC the report was "not accurate" and denied the company debated leaving the cloud market in 2018.

WATCH: Google Cloud VP of Retail Carrie Tharp: Cyber Monday is 'make or break'

See the original post:
Google reportedly set a goal of being a top-two cloud player by 2023 - CNBC

Read More..

National Science Foundation Awards Grant to Develop Next-Generation Cloud Computing Testbed Powered by Red Hat – Business Wire

RALEIGH, N.C.--(BUSINESS WIRE)--Red Hat, Inc., the world's leading provider of open source solutions, today announced that the National Science Foundation (NSF) Division of Computer and Network Systems has awarded a grant to a research team from Boston University, Northeastern University and the University of Massachusetts Amherst (UMass) to help fund the development of a national cloud testbed for research and development of new cloud computing platforms.

The testbed, known as the Open Cloud Testbed, will integrate capabilities previously developed for the CloudLab testbed into the Massachusetts Open Cloud (MOC), a production cloud developed collaboratively by academia, government, and industry through a partnership anchored at Boston University's Hariri Institute for Computing. As a founding industry partner and long-time collaborator on the MOC project, Red Hat will work with Northeastern University and UMass, as well as other government and industry collaborators, to build the national testbed on Red Hat's open hybrid cloud technologies.

Testbeds such as the one being constructed by the research team are critical for enabling new cloud technologies and making the services they provide more efficient and accessible to a wider range of scientists focusing on research in computer systems and other sciences.

By combining open source technologies and a production cloud enhanced with programmable hardware through field-programmable gate arrays (FPGAs), the project aims to close a gap in computing capabilities currently available to researchers. As a result, the testbed is expected to help accelerate innovation by enabling greater scale and increased collaboration between research teams and open source communities. Red Hat researchers plan to contribute to active research in the testbed, including a wide range of projects on FPGA hardware tools, middleware, operating systems and security.

Beyond this, the project also aims to identify, attract, educate and retain the next generation of researchers in this field and accelerate technology transfer from academic research to practical use via collaboration with industry partners such as Red Hat.

Since its launch in 2014, Red Hat has served as a core partner of the MOC, which brings together talent and technologies from various academic, government, non-profit, and industry organizations to collaboratively create an open, production-grade public cloud suitable for cutting-edge research and development. The MOC's open cloud stack is based on Red Hat Enterprise Linux, Red Hat OpenStack Platform and Red Hat OpenShift.

Beyond creating the national testbed, the grant will also extend Red Hat's collaboration with Boston University researchers to develop self-service capabilities for the MOC's cloud resources. For example, via contributions to the OpenStack bare metal provisioning program (Ironic), the collaboration aims to produce production-quality Elastic Secure Infrastructure (ESI) software, a key piece in enabling more flexible and secure resource sharing between different datacenter clusters. And by sharing new developments that enable moving resources between bare metal machines and Red Hat OpenStack or Kubernetes clusters in open source communities such as Ironic or Ansible, Red Hat and the MOC's researchers are helping to advance technology well beyond the Open Cloud Testbed.

Supporting Quotes

Michael Zink, associate professor, Electrical and Computer Engineering (ECE), University of Massachusetts Amherst: "This testbed will help accelerate innovation in cloud technologies, technologies affecting almost all of computing today. By providing capabilities that currently are only available to researchers within a few large commercial cloud providers, the new testbed will allow diverse communities to exploit these technologies, thus democratizing cloud-computing research and allowing increased collaboration between the research and open-source communities. We look forward to continuing the collaboration in MOC to see what we can accomplish with the testbed."

Orran Krieger, professor of Electrical and Computer Engineering, Boston University; co-director, Red Hat Collaboratory; PI, Massachusetts Open Cloud: "An important part of the MOC has always been to enable cloud computing research by the academic community. This project dramatically expands our ability to support researchers both by providing much richer capabilities and by expanding from a regional to a national community of researchers."

Chris Wright, senior vice president and chief technology officer, Red Hat: "This grant and the work being done by the MOC show how open source solutions can positively impact real-world challenges outside of enterprise data centers. Red Hat is no stranger to pioneering new ways in which open source software can be used for innovative research, and we are pleased to help drive this initiative in bringing open cloud technologies to a wider range of disciplines, from social sciences to physics, while also continuing our commitment to the next generation of open source practitioners."


About Red Hat, Inc.

Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Forward-Looking Statements

Certain statements contained in this press release may constitute "forward-looking statements" within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements provide current expectations of future events based on certain assumptions and include any statement that does not directly relate to any historical or current fact. Actual results may differ materially from those indicated by such forward-looking statements as a result of various important factors, including: risks related to the ability of the Company to compete effectively; the ability to deliver and stimulate demand for new products and technological innovations on a timely basis; delays or reductions in information technology spending; the integration of acquisitions and the ability to market successfully acquired technologies and products; risks related to errors or defects in our offerings and third-party products upon which our offerings depend; risks related to the security of our offerings and other data security vulnerabilities; fluctuations in exchange rates; changes in and a dependence on key personnel; the effects of industry consolidation; uncertainty and adverse results in litigation and related settlements; the inability to adequately protect Company intellectual property and the potential for infringement or breach of license claims of or relating to third party intellectual property; the ability to meet financial and operational challenges encountered in our international operations; and ineffective management of, and control over, the Company's growth and international operations, as well as other factors. In addition to these factors, actual future performance, outcomes, and results may differ materially because of more general factors including (without limitation) general industry and market conditions and growth rates, economic and political conditions, governmental and public policy changes and the impact of natural disasters such as earthquakes and floods. The forward-looking statements included in this press release represent the Company's views as of the date of this press release and these views could change. However, while the Company may elect to update these forward-looking statements at some point in the future, the Company specifically disclaims any obligation to do so. These forward-looking statements should not be relied upon as representing the Company's views as of any date subsequent to the date of this press release.

Red Hat, Red Hat Enterprise Linux, the Red Hat logo, and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux is the registered trademark of Linus Torvalds in the U.S. and other countries. The OpenStack Word Mark is either a registered trademark/service mark or trademark/service mark of the OpenStack Foundation, in the United States and other countries, and is used with the OpenStack Foundation's permission. Red Hat is not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.

View post:
National Science Foundation Awards Grant to Develop Next-Generation Cloud Computing Testbed Powered by Red Hat - Business Wire

Read More..

2020 vision: Y-Soft print industry predictions – Gigabit Magazine – Technology News, Magazine and Website

2019 has been a year of profound change as organisations across the globe have recognised the importance of digital transformation. As a result, more businesses have started experimenting with and deploying new smart tools and workflow solutions, which in turn have enabled rapid information sharing, increased productivity and automation of some of the more mundane tasks. According to Y Soft Corporation, companies have recognised that Cloud has become a crucial part of their print IT strategy, and this transformation is set to go even further with the advent of Edge computing disrupting the print sector.

In light of this transformation, Ross Penman, Head of Global Delivery Management at Y Soft, notes the following key predictions for the year ahead:

1. Cloud and Edge computing

While cloud adoption has been around for several years now, plenty of companies still struggle to move into the cloud completely because of old legacy technology and hardware. However, in a world of increased remote and flexible working, cloud adoption will not only help companies to improve their overall productivity, but it can also help companies to improve costs and risk management.

In the next 12 months more companies will see a cloud approach as a crucial part of their business strategy. By prioritising models such as Software-as-a-Service and Platform-as-a-Service, companies can begin to consume applications without having to invest in skills to build solutions themselves from the ground up. Following on from this, more companies will also begin exploring Edge computing as part of the print setup to solve issues that Cloud can introduce, such as latency and bandwidth costs, which can, in turn, impact productivity and efficiency.

2. Green credentials

Sustainable lifestyles have become increasingly important not only for individuals but for organisations, no matter how large or small they might be. Green movements such as Extinction Rebellion have built a huge political movement that has made action on climate change more urgent. As a result, individuals hold organisations more accountable than ever before, not only for their consumption but also for their contribution to the environment. Organisations are expected to improve their green credentials and do more to achieve energy efficiency goals. For most companies, print IT will play a huge role in this, and organisations are expected to report on elements such as how many trees are used to print as well as on their water and energy consumption. As a result, more companies will adopt automated scan workflows, which transform paper-based work processes into digital workflows, helping companies to keep track of their consumption and be more productive while focusing on their core everyday tasks.

3. Security and identity

While security has always played a crucial role for IT staff, the huge number of high-profile data breaches in recent years has made businesses more security-conscious than ever before. As companies continue to adopt smarter and more integrated workflow solutions, the security of the entire system must be considered as a whole. Unfortunately, too often the security of print IT is overlooked. However, print hacks such as the PewDiePie incident showed that organisations can never afford to be complacent.


Therefore, better collaboration between an enterprise solution provider, the MFD service provider and an organisation's IT department is crucial. In addition, companies that adopt good identity management will not only improve their overall security but will also simplify the complexity of managing multiple sets of credentials. This is especially important as more people within organisations adopt flexible and remote working.

4. Reseller and customer agility

Customers have recognised that having one integrated supplier across all their hardware and software solutions will not only reduce overall costs but also provide a smoother and much more integrated experience. It is important that resellers make software that supports this demand and integrates with key manufacturers in the tech space. In addition, resellers should harness new technologies to refurbish old legacy technology rather than trying to replace it completely.

5. Artificial intelligence

While many predicted at the beginning of 2019 that artificial intelligence and robotics would eliminate a huge number of jobs by 2020, there is now a consensus that the increased efficiencies delivered through these technologies could actually result in more jobs. In the next 12 months, more companies will experiment with AI and robotics to run security and quality assurance tests on software and hardware solutions. This in turn will help companies to detect security attacks, programming mistakes, and viruses much more quickly, in addition to providing higher quality products and services.

Link:
2020 vision: Y-Soft print industry predictions - Gigabit Magazine - Technology News, Magazine and Website

Read More..

Amazon used Bluetooth beacons to track attendees at its massive AWS cloud computing conference – Business Insider UK

Amazon CEO Jeff Bezos is an engineer at heart and built a company that gathers data on everything it can. "People think of Amazon as very data-oriented and I always tell them, look, if you can make the decision with data, make the decision with data," Bezos said in 2018.

So it should not be surprising that when 60,000 people descended on Las Vegas earlier this month to attend Amazon Web Services re:Invent, the cloud platform's biggest annual customer conference, Amazon wanted data on what these attendees did and where they went during the show.

Specifically, the lanyards for the conference badges that Amazon issued this year included a Bluetooth beacon from a company called TurnoutNow, AWS confirmed to Motherboard's Joseph Cox.

Not everyone was aware that the badge included such tracking information, sources told Motherboard. But many were. On Twitter, people were discussing signs that AWS placed around the registration area alerting people to the tracker.

Per posts on social media, the signs said:

"Notice that object on your lanyard? It's an anonymous beacon that lets us count the number of attendees at certain event locations, so we can facilitate foot traffic, improve transportation, and help plan future events. Your beacon is not associated with your name or any other personal information about you, and it will only send anonymous data to us about our meeting space and other central gathering locations for this event. If you'd rather have a lanyard without the beacon, please visit the help desk."

Amazon also told Business Insider that attendees were free to opt out, and that accepting the lanyard was not a requirement for admittance.

Still, given this was a conference for IT professionals, many of whom are computer security experts, a few people complained about the tracking device on Twitter, including security pro Jerry Gamblin.

Gamblin pointed out in a tweet that if some hacker figured out how to match the lanyard tracker with the name on the badge, they would no longer be anonymous. "The beacon has a unique ID so it is not anonymous and at best is pseudonymous. It would only take matching of name to the unique ID for full tracking," he tweeted.
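Gamblin's point is straightforward: a unique identifier is only anonymous until it can be joined against some other dataset that links it to a person. A minimal sketch of that join, using entirely hypothetical data and field names:

```python
# Hypothetical data for illustration only; AWS says beacon IDs were not
# linked to names or other personal information.
beacon_sightings = [
    {"beacon_id": "b-1029", "location": "expo hall", "ts": "2019-12-03T10:05"},
    {"beacon_id": "b-1029", "location": "keynote",   "ts": "2019-12-03T11:30"},
]

# If any second dataset ties the beacon ID to an identity...
badge_registry = {"b-1029": "Jane Attendee"}

# ...the "anonymous" movement trail becomes a named one.
for sighting in beacon_sightings:
    who = badge_registry.get(sighting["beacon_id"], "unknown")
    print(who, sighting["location"], sighting["ts"])
```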

And attendee Rachel Dines dissected hers and posted a picture on Twitter. "Did anyone else take apart the beacon on their #reinvent lanyard? I see a battery, an antennae and some unknown chips. Also, as soon as you remove the beacon from your lanyard, the QR code starts to fade #creepy"

The beacon was the subject of other jokes and banter, from opting out with a hammer to swapping lanyards with other attendees.

Interestingly, such conference attendee tracking devices are hardly a new idea. They've been used by conference organizers for well over a decade.

And it's a bit ironic that Amazon, of all companies, wouldn't use one of the cutting edge technologies it develops and sells to solve this problem, like machine learning/AI, computer vision, or even something more mundane, like the location data from its conference mobile app.

As one attendee on Twitter put it, "2008 wants their conference badges back."

Are you an insider with insight to share? Contact Julie Bort on encrypted chat app Signal at (970) 430-6112 using a non-work phone (no PR inquiries, please), or email at jbort@businessinsider.com. Open DMs on Twitter @Julie188.

See the rest here:
Amazon used Bluetooth beacons to track attendees at its massive AWS cloud computing conference - Business Insider UK

Read More..

Solving the Data Explosion with Fog and Edge Computing – CDOTrends

As the number of IoT devices continues to increase (a predicted 75 billion by 2025, to be exact), so do data requirements. In fact, it's estimated that IoT will generate more than 500 zettabytes of data per year by the end of 2019.

To create an environment where IoT devices and applications are seamlessly connected to one another, and their end-users, sufficient computational and storage resources are needed to perform advanced analytics and machine learning, which the cloud is capable of doing. However, cloud servers are often located too far away from the IoT endpoints to be able to effectively transmit time-sensitive data to and from billions of "things" across vast distances. This has driven the move towards edge and fog computing.

Living Life on the Edge

Edge computing allows data to be processed closer to where it originates, significantly reducing network latency. By physically bringing processing closer to the data source (such as IoT devices), there's less distance that data needs to be sent across, improving the speed and performance of devices and applications. However, the edge has its limits for heavier undertakings such as real-time analysis and machine learning.

Edge computing has paved the way for the emergence of fog computing, a term first coined by Cisco to signify a decentralized computing architecture that acts as an extension of cloud computing. The storage and computing of data are distributed in the most logical and efficient way, located between the cloud and the data source. Fog computing is seen as a complementary strategy for how edge computing can be effectively implemented while providing the compute, network, and storage capabilities of the cloud. It is estimated that the revenue produced by the fog computing market will increase by 55% between 2019 and 2026.
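The bandwidth and latency argument is easiest to see in code. Below is a minimal sketch, with hypothetical sensor readings and thresholds, of an edge or fog node that aggregates raw readings locally and ships only a compact summary (plus urgent alerts) upstream to the cloud:

```python
from statistics import mean

def summarise_at_edge(readings: list, alert_threshold: float) -> dict:
    """Aggregate raw sensor readings locally so that only a small summary,
    and any readings that need immediate attention, cross the wide-area
    link to the cloud."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": [r for r in readings if r > alert_threshold],
    }

# Hypothetical temperature samples: thousands of raw readings can stay at
# the edge; one small dictionary travels to the cloud.
raw = [21.3, 21.4, 21.2, 29.8, 21.5]
print(summarise_at_edge(raw, alert_threshold=28.0))
# {'count': 5, 'mean': 23.04, 'max': 29.8, 'alerts': [29.8]}
```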

Seeing Through the Mist

When broken down, fog computing was created to accompany edge strategies and serve as an additional architectural layer that provides enhanced processing capabilities the edge alone cannot always deliver. There are many similarities between fog computing and edge computing, such as that they both bring processing closer to the data source. However, the main difference between the two is where the processing takes place.

While fog computing offers many of the same advantages as the cloud, the cloud has limitations such as being centralized and located further away from the data source, thus increasing latency and limiting bandwidth. It's not always practical to transmit vast amounts of data all the way to the cloud and back again, especially for scenarios when processing and storage on a cloud-scale are not necessary.

Solving the Data Problem

Digital transformation means something different to every business. Meeting these new transformation challenges is forcing organizations to reconcile new architectural paradigms. For example, a highly centralized architecture often proves to be problematic as there is less control over how organizations can connect to their network service providers and end-users, ultimately causing inefficiencies in their IT strategies. At the same time, however, solely relying on small, "near edge" data centers could become expensive, putting constraints on capacity and processing workloads, and potentially creating limitations on bandwidth.

Increasingly, we're seeing organizations look to multi-tenant data centers to better support distributed architectures. It's best to think of IT infrastructure in terms of layers. The first layer consists of enterprise core data and applications, where intellectual property, high-density computing, and machine learning can live. From there, organizations can continue to add layers such as cloud computing services, distributed multi-site colocation, and 5G aggregation as part of an edge delivery platform. Through a multi-tier distributed architecture, organizations gain control over adding capacity, network, compute, and storage, and over shortening the distance between workloads and end-users. Ultimately this enhances performance and promotes improved data exchange.

Rod Glover, data centre operations director, Australia, at Digital Realty, wrote this article.

The views and opinions expressed in this article are those of the author and do not necessarily reflect those of CDOTrends.

See the original post:
Solving the Data Explosion with Fog and Edge Computing - CDOTrends

Read More..

The true value of cloud-based AI isn't what you think – InfoWorld

Artificial intelligence is one of those concepts that was hot in the '80s, kind of went away, and now is red hot. Most point to AI's new features and functions to explain its growing popularity, but it's actually because public cloud computing has made it affordable. For a few hundred bucks a month you can have some pretty heavy-duty AI systems in place that would have cost millions 10 to 15 years ago.

However, integrating AI with applications, such as banking, medical, manufacturing, and other systems, is actually not where we're finding the value of cloud-based AI. It's perhaps the most misunderstood aspect of the value of AI now, as well as in the future.

I'm talking about AI engines integrated with cloud-based and cloud-oriented management, monitoring, and self-healing services that now take advantage of AI and machine learning.

Those who sell AIops tools these days, especially where AI powers cloudops systems, understand this. Those who buy cloud-based technology, and are currently transferring core systems to public clouds, often don't. Thus, the end-state cloudops systems and processes are not as valuable as they could be. What's missing is AI and machine learning.

The points of value are clear to me, including:

The capability of self-healing. AI-based cloudops are capable of learning how things are fixed by matching problem patterns with solution patterns over time. After a while, they can do so automatically and better than humans can. This type of automation removes people from having to fix ongoing minor and major issues and increases reliability. As the cloudops knowledge engines become more experienced, they get much better over time.
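A drastically simplified sketch of that pattern-matching idea, assuming a hypothetical runbook that maps known problem signatures to remediations (real AIops platforms learn these mappings from historical incidents rather than hard-coding them):

```python
# Hypothetical problem signatures and remediations, for illustration only.
RUNBOOK = {
    "disk_usage_above_90_percent": "expand_volume",
    "pod_crashloop":               "rollback_last_deploy",
    "cert_expiring_in_7_days":     "renew_certificate",
}

def self_heal(alert: dict) -> str:
    """Match an incoming alert against known problem patterns and return
    the remediation to run; unknown patterns are escalated to a human."""
    action = RUNBOOK.get(alert["signature"])
    if action is None:
        return f"escalate_to_oncall({alert['signature']})"
    return action

print(self_heal({"signature": "pod_crashloop"}))        # rollback_last_deploy
print(self_heal({"signature": "novel_latency_spike"}))  # escalate_to_oncall(...)
```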

Better defense of cloud-based data and applications. Security and AI have long been two concepts related to each other in theory, but often not understood by either AI or security experts. Indeed, AI can allow secops systems to become proactive and learn as they go what constitutes a breach attempt, and how to defend against it.

Opportunities for sharing knowledge. An operationally oriented AI system has a great deal of value but has to learn things over time, which is fundamental to cognitive computing. What if knowledge could be shared in real time? In essence you'd have a smart ops system from day one, benefiting from collective learning and knowledge. This is going to be a larger push in the near future.

The reality is that AI is one of those things that we tend to glamorize. Although we think of science fiction depictions of AI systems, their daily value is more pragmatic and less dramatic.

Link:
The true value of cloud-based AI isn't what you think - InfoWorld

Read More..

Raleigh analytics firm Cymatic named most promising startup finalist in The Cloud Awards – WRAL Tech Wire

RALEIGH – Cymatic, a Raleigh-based provider of user and entity behavior analytics, today announced that it has been named a finalist for 2019-2020 Most Promising Startup in the international Cloud Computing Awards program, The Cloud Awards.

"To be shortlisted for our revolutionary work in web application defense is not only an honor, but a clear recognition of our early success in leading secure cloud technologies," said Cymatic Founder and Chief Executive Jason Hollander, in a statement. "We offer the only unified WAF that deploys at the client through a simple line of JavaScript, without agents or proxies, to deliver first-look, first-strike capability that is earliest in the kill chain, a fundamental shift from traditional approaches to security."

Founded in 2011, Cymatic has developed a next-generation, all-in-one web application defense platform that moves protection from the network side to the client to defend against today's most sophisticated client- and browser-based attacks.

The startup says it is the only UEBA platform to provide web applications with deep visibility into, and proactive remediation of, the threats from human and non-human attacks, as well as the vulnerabilities users bring with them on their devices. Instead of just protecting against network-based threats like traditional WAFs, Cymatic said it uses sophisticated artificial intelligence and machine-learning algorithms to identify page mutations and user anomalies. The platform protects against user-derived and device-based threats such as poor credential hygiene, dark web vulnerabilities and potentially risky devices. It is invisible and frictionless to users, deploys in mere minutes and has immediate time-to-value.
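Cymatic's own models are not public, but the general UEBA idea, scoring how far a user's current behaviour deviates from that user's own baseline, can be sketched in a few lines. The feature and threshold below are hypothetical:

```python
from statistics import mean, pstdev

def anomaly_score(history: list, current: float) -> float:
    """Z-score of the current observation against the user's own baseline.
    Production UEBA systems use many features and learned models; this is
    a toy illustration of the concept."""
    mu = mean(history)
    sigma = pstdev(history) or 1.0  # avoid division by zero
    return abs(current - mu) / sigma

# Hypothetical feature: login attempts per hour for a single user.
baseline = [2, 3, 2, 4, 3, 2]
print(anomaly_score(baseline, 3))    # small score  -> looks normal
print(anomaly_score(baseline, 40))   # large score  -> flag for review
```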

"Web applications continue to be highly vulnerable to human-derived threats such as poor credential and device hygiene. Unfortunately, today's current security solutions fail at really understanding how a user's security hygiene directly affects the cyberhealth of companies they interact with," said Hollander.

Back in August, the firm raised $4.5 million in seed funding from prominent private angel investors.

Cymatic was selected from hundreds of organizations that entered from the Americas, Australia, Europe and the Middle East. Its advanced cloud-based microservices and real-time message-bus architecture offer unparalleled scale to provide the resilience necessary to validate and process the millions of users and transactions that touch web properties every second. The platform is engineered to work across all web applications regardless of operating system, browser or device. It eliminates the OWASP top 10, bots, CAPTCHAs, dark web threats, forced MFA, shared accounts, IP threats, device vulnerabilities and other cloud-based threats with no erosion of the user experience.

The Cloud Awards is an international program which recognizes and honors industry leaders, innovators and organizational transformation in cloud computing. The awards are open to large, small, established and start-up organizations from across the entire globe, with an aim to find and celebrate the pioneers who will shape the future of the Cloud as we move into 2020 and beyond. The Cloud Awards currently offers two awards programs, the Cloud Computing Awards and the Software-as-a-Service Awards.


View post:
Raleigh analytics firm Cymatic named most promising startup finalist in The Cloud Awards - WRAL Tech Wire

Read More..