Category Archives: Cloud Servers

Cloudera Looks Like A Hidden Gem In The World Of Data Cloudification – Forbes

Rob Bearden, Cloudera CEO

Data never dies. Its lifecycle extends from creation through a never-ending stream of business cycles. At the same time, enterprises are swimming in data that ranges broadly from customer-generated information and financial transactions to edge-generated data and even operational IT server logs.

Understanding how to mine an organization's data for maximum impact, securely and in compliance with a vast and growing array of regulatory frameworks, is one of the most significant challenges facing nearly every CIO across every industry today. This challenge is only expanding, as the volume of enterprise data is estimated to be doubling every 18 to 24 months, with no sign of slowing down.

Complicating matters is the explosion of cloud-based architectures. Between edge, on-prem, public cloud, private cloud, and the rapidly expanding array of on-demand and as-a-service offerings, things can become confusing fast. It truly is a multi-cloud world in which data can live nearly anywhere. Understanding how to wrangle that data is paramount.

It's at this intersection of cloud solutions that Cloudera lives and delivers value. I believe its Cloudera Data Platform (CDP) is the hidden gem of cloudification. Cloudera's fundamental goal is to help organizations get a handle on their data so they can manage it effectively, wherever it may reside.

I recently had the pleasure of talking with Cloudera's newly minted CEO, Rob Bearden. He talked to me about the challenges facing CIOs as they navigate this world while also attempting to break down the silos between business units that are created by segregated and unmanaged pools of data.

Focused on multi- and hybrid cloud

There was a time, during the early days of cloud, when it seemed as if the public cloud was the answer to all of IT's woes. Public cloud provided a simple model delivered without the burdens of capital expenditures that often plague data center expansions. An IT administrator could push a button, and a server would suddenly appear. The pitfalls began to emerge quickly. Data ended up where it was impractical to utilize and expensive to move around. The phrase "data gravity" began to take hold in nodding acceptance of the underlying challenges of placing data where it belonged.

The real answer, as IT discovered, is not to limit yourself. Options are good. Place data and workloads where they make the most practical and operational sense. This has given us the hybrid multi-cloud world that we find ourselves in today. It's also given us continued challenges in understanding and managing the data that lives across these various environments.

Cloudera's vision is to build a modern data architecture that helps organizations derive very fast time-to-value from their data. Data management begins at the point of creation, which is why the company's solutions begin at the edge, where data is collected from multiple streams, and extend to where that data is ultimately consumed, whether that is the public cloud, a private cloud, or an enterprise's own on-prem data center.

Managing data from cradle to grave allows Cloudera's platform to break down the barriers between business units that exist in every enterprise and expose an organization's data where it makes the most sense. Data delivers value, after all, when it has real-time visibility and applicability across the enterprise.

The common wisdom is that unstructured data makes up the bulk of an organization's data, with unstructured data expected to comprise nearly 80% of all data within the next half-decade. Managing that data requires understanding what it is, so that it can be consistently and transparently secured and managed within a growing universe of regulatory requirements. Metadata must be normalized and cataloged, and data must be placed where it makes the most sense from both economic and operational standpoints.

Delivering these capabilities is a formidable challenge, yet one that I believe Cloudera is successfully taking on. The benefits of bringing Cloudera's solution into an enterprise are quickly realized, which is why it is no surprise that the company has over 400,000 servers under management, providing insight into more than five exabytes of data.

A powerful merger

Cloudera has a long history of leveraging AI and machine learning to help enterprises manage contained data. As the world evolved and edge computing became a more significant part of the equation, it was clear to Cloudera that any enterprise data strategy must span edge to cloud. Data must be managed from its point of origination to where it comes to rest.

Hortonworks, a leader in big-data solutions, thought about data management in the same way. In January of last year, the two joined together under the Cloudera name. Combining Cloudera's traditional offerings of AI-driven data management with Hortonworks' expertise in managing data from its point of inception makes the merged organization a powerhouse in data management that is nearly unmatched in the industry.

The vision for the combined company, Rob told me, is to provide the tools to manage an enterprise's data cloud with a platform that is natively multi-cloud and multifunction. Data is managed from creation through its entire lifecycle, all with consistent security and governance, through a single pane of glass. That's no small challenge.

Rob said that all of the tactical and organizational elements of the merger are now complete. The two companies today execute as a single team, with fully integrated roadmaps.

Illustrating just how smoothly the integration went: in just nine months, Cloudera delivered its Cloudera Data Platform on the public cloud, including the Cloudera Data Warehouse and Cloudera Machine Learning services. It's available on AWS and Microsoft Azure today.

Beyond its existing solutions bringing together data management for edge and cloud, its ongoing focus is on delivering private cloud capabilities to its CDP products. These features will be in beta soon, with general availability expected this summer.

A new CEO, focused on execution

One of my favorite parts of my job is getting to spend time with the CEOs of great technology companies. It helps me understand a company's vision while also allowing me to gauge the passion behind the products.

Meeting Rob Bearden is an excellent example of this. Rob is genuinely excited both about the challenges facing Cloudera and about how Cloudera is helping CIOs face the mounting challenges of a mountain of data.

Rob was named CEO of Cloudera about a month ago, and he told me that he's a little bit grateful that he came in after the hard work of integrating traditional Cloudera and Hortonworks was complete. This allows him to focus immediately on execution, delivering on the promise of the vision.

Cloudera's business is built on a total addressable market worth nearly $26B today, doubling to $52B within the next three years. That's a huge market opportunity.

Rob believes, and made me believe, that Cloudera has the right vision and the right set of technologies to ride the growth of hybrid cloud moving forward. Without the ability to manage their data in this complex world of mixed solutions, enterprise organizations cannot achieve digital transformation.

Cloudera's future

There is no question that managing data is one of the most challenging problems facing both enterprise IT and the business units that those IT organizations support. The rise of edge, which is only accelerating as 5G deploys and enables new capabilities, brings new challenges in providing the capabilities required for cataloging, security, compliance, and placement.

The integration of the various private and public cloud offerings, with on-prem architectures and the emergence of on-demand and as-a-service offerings, makes data placement and management seem nearly insurmountable.

There's more than one solution on the market to help enterprises manage data. There are fewer that look across the horizon, providing a consistent experience and set of capabilities from edge to cloud. I believe the combination of Cloudera's technologies and its expanded portfolio puts Cloudera on a perch from which it is poised to dominate the space.

I recognize this is aggressive, but I can't help but see some of the similarities between Cloudera and Microsoft when Nadella took the helm. If you recall, when Nadella took over from Ballmer, Microsoft was very much an on-prem company. Sure, Azure existed, but it was in its infancy, and on-prem was 95% of the revenue. In a very similar way, Cloudera is very successful on-prem and is moving rapidly to a hybrid and multi-cloud model. Trust is another aspect. One of the reasons many choose Azure and Azure Stack is that Microsoft is very trusted. Cloudera, with its management of so much data in heavily regulated industries, is a very trusted company. Anyway, I wanted to share that comparison.

I asked Rob why he stepped up to be CEO. He told me that it's Cloudera's opportunity to define how data will be managed over the next 15-20 years that excites him. I believe the company is poised to influence a generation of IT. He plainly states that the opportunity is Cloudera's to lose. That's what gets him out of bed in the morning.

Note: Moor Insights & Strategy data and storage analyst Steve McDowell contributed to this article.

Disclosure: Moor Insights & Strategy, like all research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, including Amazon.com, Advanced Micro Devices, Apstra, ARM Holdings, Aruba Networks, AWS, A-10 Strategies, Bitfusion, Cisco Systems, Cloudera, Dell, DellEMC, Dell Technologies, Diablo Technologies, Digital Optics, Dreamchain, Echelon, Ericsson, Foxconn, Frame, Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Google, HP Inc., Hewlett Packard Enterprise, Huawei Technologies, IBM, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, MACOM (Applied Micro), MapBox, Mavenir, Mesosphere, Microsoft, National Instruments, NetApp, NOKIA, Nortek, NVIDIA, ON Semiconductor, ONUG, OpenStack Foundation, Panasas, Peraso, Pixelworks, Plume Design, Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Samsung Electronics, Silver Peak, SONY, Springpath, Sprint, Stratus Technologies, Symantec, Synaptics, Syniverse, TensTorrent, Tobii Technology, Twitter, Unity Technologies, Verizon Communications, Vidyo, Wave Computing, Wellsmith, Xilinx, Zebra, which may be cited in this article.

Link:
Cloudera Looks Like A Hidden Gem In The World Of Data Cloudification - Forbes

Rootkit in the Cloud: Hacker Group Breaches AWS Servers – Computer Business Review


Seems like a hell of a lot of effort; it must have been a target of real interest

A sophisticated hacker group pwned Amazon Web Services (AWS) servers, set up a rootkit that let them remotely control servers, then merrily funnelled sensitive corporate data home to its command and control (C2) servers from a range of compromised Windows and Linux machines inside an AWS data centre.

That's according to a report from the UK's Sophos published late last week, which has raised eyebrows and questions in the security industry. The attackers neatly sidestepped AWS security groups (SGs), which, when correctly configured, act as a security perimeter for associated Amazon EC2 instances.
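To make the security-group mechanic concrete, here is a minimal illustrative sketch, not the AWS API: an SG is effectively an allow-list of (protocol, port range, source CIDR) rules, and inbound traffic that matches no rule is dropped. The rule values below are hypothetical examples.

```python
# Illustrative model of a security group: an allow-list of inbound rules.
# (Not the AWS API; rule values are hypothetical.)
import ipaddress

def sg_allows(rules, protocol, port, source_ip):
    """Return True if any rule permits this inbound connection."""
    for rule in rules:
        if rule["protocol"] != protocol:
            continue
        if not (rule["from_port"] <= port <= rule["to_port"]):
            continue
        if ipaddress.ip_address(source_ip) in ipaddress.ip_network(rule["cidr"]):
            return True
    return False  # anything not explicitly allowed is denied

# A "correctly tuned" SG: HTTPS from anywhere, SSH only from one office range.
rules = [
    {"protocol": "tcp", "from_port": 443, "to_port": 443, "cidr": "0.0.0.0/0"},
    {"protocol": "tcp", "from_port": 22, "to_port": 22, "cidr": "203.0.113.0/24"},
]
```

The point the Sophos report makes is that once a rootkit relays C2 traffic over a port such rules already allow, the perimeter check still passes even though the SG is correctly configured.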

The unnamed target of this attack had correctly tuned their SGs. But with a rootkit installed on their AWS servers that gave attackers remote access, the compromised Linux system was still listening for inbound connections on ports 2080/TCP and 2053/TCP: something that eventually triggered Sophos' intervention.

Sophos was at pains to emphasise that while this particular attack targeted AWS servers, it was not an AWS problem per se. It represents a method of piggybacking C2 traffic on legitimate traffic in a way that can bypass many, if not most, firewalls.

Security experts agreed that the attacker, likely a nation-state actor, could have used the bespoke rootkit to funnel data off most servers, whether in the cloud or on-premises. (Those interested in the precise details of how the attackers managed their data exfiltration can refer to the detailed technical write-up here.)

Sophos dubbed the incident, which used a customised Gh0st RAT trojan, "Cloud Snooper". One cybersecurity researcher (whose initial reaction was: "dude, this happens all the time. It only gets noticed if it has a fancy name") described it to us, after looking closely at the incident, as "from a technical perspective, a thing of beauty".

Many questions about the security breach remain unanswered, however, not least how the attackers got the rootkit onto the AWS servers to start with.

Sophos said: "An analysis of this system revealed the presence of a rootkit that granted the malware's operators the ability to remotely control the server through the AWS SGs. But this rootkit's capabilities are not limited to doing this in the Amazon cloud: it also could be used to communicate with, and remotely control, malware on any server behind any boundary firewall, even an on-premises server."

"By unwinding other elements of this attack, we further identified other Linux hosts, infected with the same or a similar rootkit."

The company added: "Finally, we identified a compromised Windows system with a backdoor that communicated with a similar C2 as other compromised Linux hosts, using a very similar configuration format. The backdoor is apparently based on source code of the infamous Gh0st RAT malware."

At the heart of the attack was another backdoor trojan, dubbed "snoopy", that can be executed both as a command line tool and as a daemon. It opens HTTP and/or DNS services on a compromised system and allows traffic tunneling, operating as both a reverse SOCKS5 proxy server and client.

(Snoopy stores many debug messages in clear text, several in Chinese, e.g. "Remote memory space allocation failed!")

Sophoss full write-up of the techniques used can be found here [pdf].

The security firm noted: "AWS SGs provide a robust boundary firewall for EC2 instances. However, this firewall does not eliminate the need for network administrators to keep all external-facing services fully patched."

"The default installation for the SSH server also needs extra steps to harden it against attacks, turning it into a rock-solid communication daemon."

Security researcher Willem Mouton told Computer Business Review: "From a technical perspective it is a thing of beauty, also the fact that they made it cross platform."

"The one thing that the article did not clear up was what the initial entry vector, as well as the privilege escalation, was. In order to install such a rootkit you would probably [need] root on Linux and LocalAdmin/System level privileges on Windows."

"This rootkit was most probably deployed to maintain an advanced covert level of network persistence. Which makes me wonder whose network they found this on, because that seems like a hell of a lot of tech and effort, so it must have been a target of real interest. Also, the article mentions everything was hosted on AWS, and usually you would see attackers go for the AWS/cloud tenancy or subscription to maintain access, but again nothing of that was mentioned."

"I would love to see the full outcome of their investigation."

Sophos said Indicators of Compromise (IoCs) included having the following ports open on localhost: tcp 2080; udp 2053; tcp 10443. Suspect file names include /tmp/rrtserver-lock; /proc/sys/rrootkit; /tmp/rrtkernel.ko; /usr/bin/snd_floppy; snd_floppy.
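A defender could sweep hosts against these published indicators with a few lines of code. The sketch below, assuming you have already collected each host's open ports and file listing by other means, simply matches that inventory against the IoC lists from the article:

```python
# Minimal sketch: match a host's observed state against the Cloud Snooper
# IoCs listed above (ports and file names taken from the article).
IOC_PORTS = {("tcp", 2080), ("udp", 2053), ("tcp", 10443)}
IOC_FILES = {
    "/tmp/rrtserver-lock",
    "/proc/sys/rrootkit",
    "/tmp/rrtkernel.ko",
    "/usr/bin/snd_floppy",
}

def match_iocs(open_ports, present_files):
    """Return a sorted list of indicators found on this host, if any."""
    hits = [f"{proto}/{port} open" for proto, port in open_ports & IOC_PORTS]
    hits += [f"{path} present" for path in present_files & IOC_FILES]
    return sorted(hits)
```

An empty result means none of these specific indicators were seen; it is not proof of a clean host, since the rootkit's file names are trivially changeable.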

The following warning syslog messages also showed up:

One high profile previous attack on cloud servers was demonstrated by Eclypsium, which leased a bare metal IBM server and exploited a vulnerability in its Baseboard Management Controller (BMC); a third-party server component used to enable remote management for initial provisioning, OS reinstall and troubleshooting.

It then relinquished the use of the server, which was re-released for use by other cloud customers. But the BMC was not re-flashed with factory firmware, meaning Eclypsium sustained its access, in an incident that IBM Cloud played down.

See more here:
Rootkit in the Cloud: Hacker Group Breaches AWS Servers - Computer Business Review

Amazon, Microsoft cloud-computing can weather a recession and coronavirus, analysts say – Seattle Times

In the decade since the Great Recession, cloud computing became the de facto information-technology strategy for startup companies and, increasingly, large corporations alike. The business of renting remote computing power has grown into an enormous industry and, with No. 1 player Amazon and No. 2 Microsoft based in the Seattle area, a mainstay of the region's broader tech-driven economy.

As fears of a recession mount with the spread of the novel coronavirus, cloud analysts are considering how this $263 billion industry would fare in its first significant economic downturn since reaching maturity. The short answer: fairly well, especially for the market leaders.

Among the digital giants, "nobody's scaling back for a blip," said John-David Lovelock, chief forecaster with research and advisory firm Gartner, which expects global public cloud-services revenue to increase 33% to more than $350 billion by 2022.

"You don't build a cloud provider of the scale we're talking about here without a plan to do it that spans decades," said Corey Quinn, cloud economist with The Duckbill Group. "It transcends the boundaries of any individual economic cycle."

Even in the event of a severe global recession, there are reasons to expect that cloud computing, which fundamentally changed the information-technology business model, would continue to grow. That's what happened during the last recession, when the technology was still nascent.

Cloud computing is a fast-growing business at both Redmond-based Microsoft and Seattle-based Amazon, where it balances the thinner profit margins of the retail side of the company. Cloud competitors including Google and IBM also have major engineering offices in the region.

In the quarter ended Dec. 31, Microsoft reported sales of $11.9 billion in the business segment that includes its Azure cloud computing business, lumped in with its traditional server software and business consulting services. The company said Azure sales increased 62% from a year earlier, though it doesn't disclose the revenue figure. Amazon Web Services (AWS) reported revenue of nearly $10 billion in the same period, up 34%.

Cloud services companies allow customers to rent remote computing power, scaling up and down usage, and associated costs, as needed. The cloud has steadily replaced the old model of organizations building and owning their own servers and data centers, which takes time, requires large up-front capital outlays as well as ongoing maintenance costs, and leaves them with excess computing capacity that goes unused except during brief periods of peak demand.

A business running on the cloud that experiences a spike in customer traffic to its website can immediately call on servers in a global network of Amazon or Microsoft data centers to handle the load. When the traffic subsides, they can turn off those services. Likewise, if a company needs to perform a complex analysis or test a machine learning algorithm, it can rent nearly limitless computing power from a cloud provider for a few hours, rather than incurring the cost of owning it.
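The economics described above can be sketched with a toy calculation. All the numbers here are hypothetical, chosen only to show why renting capacity per hour beats owning enough servers for the worst hour when traffic is spiky:

```python
# Toy comparison of owning peak capacity vs renting per hour.
# CAPACITY and HOURLY_RATE are assumed illustrative values, not real pricing.
import math

CAPACITY = 1000   # requests per hour one server can handle (assumed)
HOURLY_RATE = 2   # cost of running one server for one hour (assumed)

def owned_cost(hourly_load):
    # Owning: provision for the worst hour, pay for every server all day.
    servers = math.ceil(max(hourly_load) / CAPACITY)
    return servers * HOURLY_RATE * len(hourly_load)

def rented_cost(hourly_load):
    # Renting: spin up only what each hour needs, then turn it off.
    return sum(math.ceil(load / CAPACITY) * HOURLY_RATE for load in hourly_load)

# A spiky day: 22 quiet hours, then a two-hour traffic spike.
day = [500] * 22 + [10000, 8000]
```

With these numbers the owned fleet sits mostly idle and costs several times the rented equivalent; the gap narrows as load flattens, which is one reason steady baseline workloads are the part of a cloud bill that is hardest to cut.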

In practice, businesses tend to scale up their cloud usage but don't often scale it back down, said Quinn, whose firm helps companies manage their AWS bills and has customers that spend in aggregate about $1 billion a year on Amazon's cloud.

Whether by strategy or neglect, they opt to incur higher cloud-computing bills rather than risk constricting capacity and upsetting users, he said. That may start to change as businesses consider belt-tightening measures in the next recession.

"That is going to come under an increased level of scrutiny almost certainly when companies start looking at where they are able to cut costs," Quinn said.

But even if cloud customers start combing through their bills, for most, there's only so much they can cut. While some customer-facing applications can scale up and down with demand, and discretionary development projects can be put on hold, other cloud applications that are core to a business's basic operations don't change much with revenue fluctuations.

That's a change from the Great Recession of 2007-2009, when cloud computing was a relatively small feature of the information-technology landscape, used for discrete applications or by small teams within an organization.

"The difference now is that there are entire companies running their computing environments [in the cloud] at a scale that we've never seen anything remotely close to," Quinn said.

Cloud providers also offer steep discounts for multiyear spending commitments, which could make it harder for a customer to trim their cloud spending rapidly, Quinn said.

Lovelock said cash-flow constraints in a recession could also prompt businesses to use more cloud services rather than buy their own information-technology equipment.

That's what happened with Salesforce in the Great Recession. The cloud-based provider of customer-relationship-management software saw revenue grow 21% year over year in 2009, while the broader software category shrank 3%, he said.

"There was still business to be done, but there was limited cash flow," Lovelock said. Cloud computing, or software as a service as it was more commonly called then, became the way things got done.

Another factor potentially helping cloud computing weather a possible coronavirus-driven recession: with more people in self-quarantine to avoid contracting or spreading the illness, cloud-based applications for telecommuting and entertainment could see even more usage, though many video-heavy applications, such as Netflix, are distributed through private content delivery networks.

Cloud software revenue grew through the last recession, Lovelock said, but spending on hardware took a hit. That's what Lovelock expects to happen in the event of a coronavirus-driven recession. Disruptions to the hardware supply chain are already being felt, particularly given the heavy concentration of semiconductor production in Wuhan, China, the epicenter of the coronavirus outbreak.

Tellingly, Microsoft last week revised its quarterly sales guidance, citing a slower-than-anticipated return to normal operations in its hardware supply chain. The guidance update made no mention of impacts to Microsoft's cloud computing or other businesses.

And so far, neither of the Seattle area's cloud giants appears to be slowing its hiring.

On Tuesday afternoon, Amazon had 14,217 job listings for its Amazon Web Services business, more than a third of the company's total openings. Microsoft does not allow its job listings to be filtered the same way, but the word Azure appeared in more than half of its 7,301 listings.

Neither company would comment for this story.


More:
Amazon, Microsoft cloud-computing can weather a recession and coronavirus, analysts say - Seattle Times

Microsoft To Keep The Universal Cloud Print Party Going After Google Shut Off The Lights – Hot Hardware

Microsoft is getting ready to introduce a cloud-based printing infrastructure called Universal Print, which is aimed at making it easier for commercial and educational customers to print from anywhere with Microsoft 365, a bundle of Windows 10, Office 365, and Enterprise Mobility + Security. The feature is currently being tested in private preview form.

"Universal Print moves key Windows Server print functionality to the Microsoft 365 cloud, so organizations no longer need on-premises print servers and do not need to install printer drivers on devices. In addition, Universal Print adds key functionality like security groups for printer access, location-based printer discovery, and a rich administrator experience," Microsoft explains.

"The way people work is changing as cloud computing and technology continue to expand and evolve, driving digital transformation. Canon Inc.'s imageRUNNER ADVANCE and Office Printers provide the flexibility and scalability to address diverse workplace needs. In partnership with Microsoft, we are committed to supporting Universal Print and supporting our customers in their journey to the digital workplace," said Isamu Sato, Senior General Manager, Office Imaging Products Operations, Canon Inc.

Existing printers that lack native support can use a Universal Print proxy application that will connect them to the service. That's probably part of what Microsoft is testing in preview form at the moment.


Read this article:
Microsoft To Keep The Universal Cloud Print Party Going After Google Shut Off The Lights - Hot Hardware

Microsoft officially announces Universal Print, a cloud-based print server solution – MSPoweruser – MSPoweruser

Last week, we reported that Microsoft is working on a new cloud-based print solution called Universal Print. Today, Microsoft officially announced Universal Print, a cloud-based print infrastructure that will enable a simple, rich, and secure print experience for users and help reduce time and effort for IT.

Right now, large organizations deploy print servers, which connect printers with client computers over a network. Print servers manage the jobs initiated by users based on printer availability. Universal Print moves print server functionality to the cloud, so organizations no longer need on-premises print servers, and they do not need to install printer drivers on devices. Universal Print also offers features like security groups for printer access, location-based printer discovery, and a rich admin experience for IT departments.

Universal Print in Azure portal will allow admins to manage their print devices without deploying print servers:

Interested enterprise users can sign up for the Microsoft Universal Print private preview here. Microsoft is working with printer OEMs like Canon to add native support for Universal Print to the latest printers. For existing printers, organizations can use a Universal Print proxy application that connects printers to Universal Print.

Source: Microsoft

Read more:
Microsoft officially announces Universal Print, a cloud-based print server solution - MSPoweruser - MSPoweruser

Benefits of Kubernetes on bare metal cloud infrastructure – Ericsson

Cloud native technology and the role of CNCF

Let's start with a brief recap of cloud native and the CNCF (Cloud Native Computing Foundation). With the introduction of 5G, new use cases drive the need for designing applications based on containers and service-based architecture to address some technology gaps associated with virtualization. The most important of these gaps involve smoother software upgrades, automation, and the realization of a CI/CD software pipeline to end customers.

At the center of cloud native technology development is the CNCF, an open source community driving the adoption of the cloud native paradigm across industries by fostering collaboration between the industry's top developers, end users, and vendors. Since the CNCF is such a huge community, its focus on the telecom industry has been limited. But now, with the formation of the Telecom User Group (a Special Interest Group) within the CNCF, Ericsson has taken a leading role in telecom-related discussions in the community.

The current industry trend is to deploy Kubernetes in virtual machines hosted on a virtualized platform within an NFVI solution. This deployment approach works well as an intermediate step for introducing cloud native applications, but deploying Kubernetes in virtual environments adds additional cost and complexity to the cloud infrastructure.

In order to further simplify the deployment of Kubernetes and to leverage the full benefits and efficiency promise of cloud native technology, the underlying cloud infrastructure needs to be optimized. That's why Ericsson recently launched a new cloud infrastructure solution optimized for cloud native applications.

The solution, Ericsson Cloud Native Infrastructure, is using a bare metal cloud native architecture which means that no virtualization layer is needed in the cloud stack. Instead, cloud native applications are deployed in containers running directly on the bare metal cloud, which results in a radical simplification of the network implementation. It is important to note that virtual network functions (VNFs) will be around for years to come, and they will continue to run on NFVI.

Ericsson Cloud Native Infrastructure enables a fully automated deployment of a Kubernetes layer over bare metal servers without the need for an underlying virtualization layer. This evolution in NFVI is a huge step forward in simplifying the deployment and operations of telecom cloud native applications.
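For a sense of what bootstrapping Kubernetes directly on bare metal involves, here is a hedged sketch that renders the kind of minimal config the upstream kubeadm tool accepts. Ericsson's own automation is proprietary and is not shown here; the endpoint and CIDR values below are hypothetical:

```python
# Sketch: render a minimal kubeadm ClusterConfiguration of the kind used to
# bootstrap a bare-metal control plane. Values are hypothetical examples;
# this illustrates the general shape, not Ericsson's product.
def kubeadm_config(pod_cidr, endpoint):
    return (
        "apiVersion: kubeadm.k8s.io/v1beta3\n"
        "kind: ClusterConfiguration\n"
        f"controlPlaneEndpoint: {endpoint}\n"
        "networking:\n"
        f"  podSubnet: {pod_cidr}\n"
    )

cfg = kubeadm_config("10.244.0.0/16", "cp.example.internal:6443")
```

Such a file would typically be passed to `kubeadm init --config`, followed by installing a CNI plugin; on bare metal there is no cloud load balancer, so the control-plane endpoint is usually a virtual IP managed by something like keepalived.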

Ericsson estimates that the TCO savings of deploying Kubernetes on bare metal compared to a virtualized infrastructure can be as high as 30 percent, depending on application and configuration. Ericsson Cloud Native Infrastructure will be available on the market later this year to support CSPs' 5G and cloud native introduction. View this brief video about Ericsson Cloud Native Infrastructure summarizing the next step in cloud infrastructure evolution.

Watch the Telecom TV interview where Peter Wörndle, an expert on NFV at Business Unit Digital Services at Ericsson, discusses how far CSPs have come in adopting cloud native technologies.

Go here to see the original:
Benefits of Kubernetes on bare metal cloud infrastructure - Ericsson

Kneron Named to the 2020 CB Insights AI 100 List of Most Innovative Artificial Intelligence Startups – Yahoo Finance

Innovation leader honored for achievements in the on-device edge AI industry

Kneron, Inc., a leading on-device edge artificial intelligence (AI) company based in San Diego, California, was named by CB Insights to the fourth annual AI 100 ranking, showcasing the 100 most promising private AI companies in the world.

"To be listed on the AI 100 is an honor," stated Albert Liu, Kneron's Cofounder and CEO. "It represents our determination to expand AI inferencing from the cloud to the edge so that private user data can be more secure, and edge AI devices and applications can be more ubiquitous in our everyday lives. We're excited and inspired to see our work being recognized by CB Insights."

Kneron's on-device edge AI solutions include AI chips and AI software models that enhance smart devices with AI functions without the constant need to be connected to a cloud-based AI service, because the AI inferencing happens where the data is collected. This greatly reduces the bandwidth needed to share vast amounts of private user data to be computed and stored on cloud servers, and the possibility that the data can be compromised, leaked, or hacked.

With real solutions being employed by partners in the market already, Kneron is pioneering the growth of the on-device edge AI industry.

"It's been remarkable to see the success of the companies named to the Artificial Intelligence 100 over the last four years. The 2019 AI 100 saw 48 companies go on to raise $4.9B of additional financing and nine got acquired," said CB Insights CEO Anand Sanwal. "It has been gratifying to see that CB Insights' data-driven approach to identifying the top AI companies using patents, customer traction, investor quality, market sizing and more has become so effective at picking the AI winners of tomorrow. We look forward to seeing what the 2020 AI 100 companies will accomplish over the course of this year and beyond."

Through an evidence-based approach, the CB Insights research team selected the AI 100 from nearly 5,000 companies based on several factors, including patent activity, investor quality, news sentiment analysis, proprietary Mosaic scores, market potential, partnerships, competitive landscape, team strength, and tech novelty. The Mosaic Score, based on CB Insights' algorithm, measures the overall health and growth potential of private companies to help predict a company's momentum.

About Kneron

Kneron, established in San Diego in 2015, is a leading provider of on-device edge AI solutions. It is dedicated to the design and development of integrated software and hardware solutions for the smart home, smart surveillance, smartphones, personal computers, robots, drones, and other IoT devices. Its products include AI chips and AI models that accelerate on-device AI inferencing while enhancing privacy and security. Kneron's mission is to enable AI everywhere through reconfigurable solutions that allow efficient processing of present and future image and audio AI models, making on-device edge AI affordable to adopt. Kneron's investors include Horizons Ventures, Qualcomm, Sequoia Capital, and more. To date, Kneron has received financing of US$73 million. For more information about Kneron, please visit http://www.kneron.com.

About CB Insights

CB Insights helps the world's leading companies accelerate their digital strategy and transformation efforts with data, not opinion. Our Emerging Tech Insights Platform provides companies with actionable insights and tools to discover and manage their response to emerging technology and startups. To learn more, please visit http://www.cbinsights.com.

View source version on businesswire.com: https://www.businesswire.com/news/home/20200303005868/en/

Contacts

Jason Zheng: jason.zheng@kneron.us

CB Insights: awards@cbinsights.com


Investing in as-a-service to light a SPARC under IT modernization – GCN.com

INDUSTRY INSIGHT

Federal IT teams are caught in a vicious modernization cycle. They're trying to follow the guidelines laid out by the Data Center Optimization Initiative and the Trump administration's Cloud Smart policy, which call for IT consolidation, optimization and cloud migration. Yet they're also reliant -- both financially and technologically -- on established legacy systems, like Oracle's SPARC (Scalable Processor Architecture) servers. For years, they've developed systems that are now costing agencies millions of dollars in maintenance fees.

In many cases, IT administrators keep investing in these older systems because they feel like they have no choice -- but now they do.

Instead of paying high maintenance costs or making a major capital commitment to rip and replace existing infrastructure, agencies can invest in as-a-service models that upgrade legacy systems and create a gateway toward a modern, multicloud future. In doing so, they can maximize performance and reduce costs over the long term.

Let's take a look at how agencies can upgrade their aging infrastructure and pave the way to a more cost-effective, consolidated and higher-performing future. We'll use SPARC as an example of how they can migrate from an outdated system to a modern, cloud-centric architecture.

A dwindling SPARC

SPARC was first introduced in 1987 (by Sun Microsystems, later acquired by Oracle) and has since become nearly ubiquitous in government circles. Agencies use these servers to manage intensive applications and workloads, from system applications to enterprise resource planning and beyond. The thought of migrating these applications off of this infrastructure understandably gives these organizations pause -- they think it's too expensive, too risky, too time-consuming.

But Oracle has already announced SPARC's end of life. As with any technology that's several decades old, agencies still using it pay a premium to maintain it. Fortunately, there is a better way for agencies to keep their mission focus and continue on the path toward modernization.

Rekindle SPARC with infrastructure as a service

Migrating to a private cloud, infrastructure-as-a-service (IaaS) model is an ideal option. With this model, agencies can upgrade to a modern, secure, reliable and flexible infrastructure that addresses their modernization needs without the disruption and cost associated with retrofitting. They won't need to rewrite code, upgrade their operating systems or experience downtime. They can eliminate technical and operational concerns while supporting the requirements of consolidation, optimization and the cloud.

The movement to an IaaS model can be done over time and in a modular fashion. Administrators can develop an interim upgrade strategy for critical legacy applications while they work on creating a long-term migration plan that will benefit their agencies over the next several years.

This approach lets them develop a strategic cloud model -- private, public, hybrid, or multicloud -- that best suits their unique needs. For instance, many organizations choose multicloud to solve their most complex IT challenges. IaaS can support multicloud initiatives by allowing agencies to auto-deploy applications with the proper configurations on the correct cloud platforms.

Ignite cost reductions

In addition to setting a path toward the future, upgrading from legacy SPARC to a consumption-based service can help agencies meet today's consolidation and optimization objectives.

IaaS can significantly reduce the need for expensive and space-consuming legacy hardware, thereby allowing agencies to save on resources. By migrating to an IaaS model, for example, agencies can save money on everything from cooling and power costs to maintenance bills. They won't have to worry about outdated technology or the usual three- to four-year technology refresh cycle that requires a significant investment.
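A rough cost sketch illustrates the trade-off. The numbers below are hypothetical assumptions for illustration only, not vendor pricing: they compare owning and maintaining legacy hardware over one four-year refresh cycle against a flat consumption-based IaaS fee.

```python
# Hypothetical four-year cost comparison (illustrative figures only):
# legacy hardware ownership versus a consumption-based IaaS subscription.

YEARS = 4
hw_capex = 400_000             # assumed up-front server purchase
annual_maintenance = 80_000    # assumed legacy support contract
annual_power_cooling = 30_000  # assumed data-center overhead

legacy_total = hw_capex + YEARS * (annual_maintenance + annual_power_cooling)

iaas_monthly = 12_000          # assumed flat consumption-based fee
iaas_total = iaas_monthly * 12 * YEARS

print(f"legacy 4-year cost: ${legacy_total:,}")
print(f"IaaS 4-year cost:   ${iaas_total:,}")
print(f"savings:            ${legacy_total - iaas_total:,}")
```

The actual break-even point depends heavily on workload, contract terms and utilization, but the structure of the calculation is the same: capital outlay plus recurring overhead versus a predictable operating expense.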

Savings can then be allocated to initiatives that will add value to the agency's mission over the long haul. Operating expenses can be used to upgrade organizational infrastructure and invest in new initiatives, including the development of applications that can benefit warfighters and citizens.

Fan the flames of innovation

Building platforms that bridge the past, present and future requires a strong infrastructure. With IaaS, agencies can optimize their legacy applications, yet realize opportunities to innovate. They can create an ideal environment upon which they can leverage modern technologies like artificial intelligence, machine learning and more.

Agencies that adopt this approach will be able to enjoy far more flexibility than they have with their legacy architectures. They can turn resources on or off and shift suitable workloads to the right cloud while keeping other applications on-premises. And, they can significantly expedite the deployment of new services to support new government initiatives.

IaaS is a fiscally and technologically smart approach that can get agencies out of the modernization cycle and onto a smart path to their most effective cloud model.

About the Author

Rob Davies is executive vice president of operations at ViON.


The Top 10 Best Cloud Storage Books You Need to Read in 2020 – Solutions Review

Sometimes, in order to advance technologically, it's helpful to look at an old-school approach. Cloud storage is a great option for your organization, but you should first ensure that it is right for you and your team. While there are many free resources available online (such as Solutions Review's best practices articles, solutions directories, and buyer's guides), doing things the old-fashioned way can be beneficial. Solutions Review has taken the liberty of doing the research for you, having reviewed a multitude of books. We've carefully selected the 10 best cloud storage books based on relevance, popularity, review ratings, publish date, and ability to add business value. Here they are, in no particular order.

Cloud Storage Made Easy: Securely Backup and Share Your Files

By James Bernstein

Cloud Storage Made Easy was written to help you get an understanding of what cloud storage is and how it's used. The focus of this book is to get you up to speed on the concepts of cloud storage and the most popular home and small business cloud storage platforms so you can make an educated choice as to which cloud storage service will work best for you.

The History of Data Storage

By Larry Freeman

This book contains an in-depth look at the products and innovations that created the digital data storage industry we live in today. Suitable as an academic reference, it contains a first-person account of the past, present, and likely future trends in the data storage industry.

The Art of Capacity Planning: Scaling Web Resources in the Cloud

By Arun Kejariwal and John Allspaw

In their early days, Twitter, Flickr, Etsy, and many other companies experienced sudden spikes in activity that took their web services down in minutes. Today, determining how much capacity you need for handling traffic surges is still a common frustration of operations engineers and software developers. This hands-on guide provides the knowledge and tools you need to measure, deploy, and manage your web application infrastructure before you experience explosive growth.

Building a Future-Proof Cloud Infrastructure: A Unified Architecture for Network, Security, and Storage Services

By Silvano Gai

Network pioneer Silvano Gai demonstrates the DS Platform's remarkable capabilities and guides you through implementing them on diverse hardware. Building a Future-Proof Cloud Infrastructure is for network, cloud, application, and storage engineers, security experts, and every technology professional who wants to succeed with tomorrow's most advanced service architectures.

iCloud for Beginners: A Ridiculously Simple Guide to Online Storage

By Scott La Counte

Photos used to be relatively small, but as cameras have advanced, file sizes have gone up; most photos on your phone are several MB each. iCloud means you can keep the newest ones on your phone and put the older ones in the cloud. It also means you don't have to worry about paying for the phone with the biggest hard drive. In fact, even if you have the biggest hard drive, there's a chance it won't fit all of your photos. This short book will help new users navigate their way around the cloud service.

Information Storage and Management: Storing, Managing, and Protecting Digital Information in Classic, Virtualized, and Cloud Environments

By EMC Education Services

This new edition of the unparalleled bestseller serves as a full training course all in one, and as the world's largest data storage company, EMC is the ideal author for such a critical resource. The authors cover the components of a storage system and the different storage system models, while also offering essential new material that explores advances in existing technologies and the emergence of the cloud, along with updates and vital information on new technologies.

Database Cloud Storage: The Essential Guide to Oracle Automatic Storage Management

By Nitin Vengurlekar

Build and manage a scalable, highly available cloud storage solution. Filled with detailed examples and best practices, this Oracle Press guide explains how to set up a complete cloud-based storage system using Oracle Automatic Storage Management. Find out how to prepare hardware, build disk groups, efficiently allocate storage space, and handle security. Database Cloud Storage: The Essential Guide to Oracle Automatic Storage Management shows how to monitor your system, maximize throughput, and ensure consistency across servers and clusters.

Learning Microsoft Azure Storage: Build Large-Scale, Real-World Apps by Effectively Planning, Deploying, and Implementing Azure Storage Solutions

By Mohamed Waly

You will start this book with an introduction to Microsoft Azure storage and how it can be used to build large-scale, real-world applications using Azure storage services such as blob, table, queue, and file. This book will also teach you about the different types of Azure Storage. You will then learn best practices for designing your Azure VM storage, whether it is Windows-based or Linux-based, and how to migrate your storage in different scenarios.

Data Storage Networking: Real World Skills for the CompTIA Storage+ Certification and Beyond

By Nigel Poulton

This book covers data storage from the basics to advanced topics, and provides practical examples to show you ways to deliver world-class solutions. In addition, it covers all the objectives of the CompTIA Storage+ exam (SG0-001), including storage components, connectivity, storage management, data protection, and storage performance.

Cloud Storage Forensics

By Darren Quick, Ben Martini, and Raymond Choo

Cloud Storage Forensics presents the first evidence-based cloud forensic framework. Using three popular cloud storage services and one private cloud storage service as case studies, the authors show you how their framework can be used to undertake research into the data remnants on both cloud storage servers and client devices when a user undertakes a variety of methods to store, upload, and access data in the cloud.

Tess Hanna is an editor and writer at Solutions Review covering Backup and Disaster Recovery, Business Process Management, and Talent Management. She aims to simplify the research process for IT professionals. You can contact her at thanna@solutionsreview.com



Micro server IC market is expected to reach USD 1.35 Billion by 2022, at a CAGR of 44.0% – Virtual-Strategy Magazine

The global micro server IC market is expected to reach USD 1.35 billion by 2022, growing at a CAGR of 44.0% between 2016 and 2022. In the present scenario, data traffic is continually rising, driving the demand for secure and reliable storage and processing of data. To accommodate this large volume of data, companies are deploying new data centers or upgrading their existing data centers into hyperscale or mega data centers. Other major drivers for the growth of the market are the low power and space consumption of micro servers and the growing trend of cloud computing.

Medium-sized organizations are adopting micro server ICs owing to their lower purchase and installation costs. In addition, it is easy to upgrade the system as load increases by simply adding server nodes. Thus, the medium-scale enterprise segment holds the largest market for micro server ICs. These organizations need to scale immediately whenever required to stay competitive in the market. Micro servers help such organizations meet their IT requirements, as they are easy to set up, use, and maintain without taking up a lot of space. They provide plenty of power, security, and expansion without requiring dedicated IT resources, which is imperative for medium enterprises.

Download Free PDF Brochure @

https://www.marketsandmarkets.com/pdfdownloadNew.asp?id=952

A data center is a facility used to house computer systems and associated components, such as telecommunications and storage systems. Micro servers are less expensive and consume less power and space than traditional enterprise-class rack servers. They can easily be grouped into clusters and are well suited for tasks that do not require multiple CPUs. The growing need for data centers in various business domains such as IT & telecom, banking, healthcare, agriculture, and government, among others, would drive the data center application to grow at the highest CAGR during the forecast period.

The micro server IC market is expected to grow at the highest CAGR in APAC, as countries such as India and China are witnessing huge technological advancements, and the region is expected to see many startups and business expansions in the coming years. Currently, the demand for application-specific servers is huge. The main reason for this is the increasing need for data centers and cloud computing in fast-developing economies such as India and China, which still have huge potential for internet connectivity growth. The application-specific customization capability of micro servers makes them a better option for data center and cloud computing companies, which can tailor micro servers to workload needs. Such features would create huge demand for micro servers in the APAC region.

Media Contact
Company Name: MarketsandMarkets
Contact Person: Mr. Sanjay Gupta
Email: Send Email
Phone: 1-888-600-6441
Address: 630 Dundee Road, Suite 430
City: Northbrook
State: IL 60062
Country: United States
Website: https://www.marketsandmarkets.com/Market-Reports/micro-servers-market-952.html
