
Global Cloud Server Market 2020 By Analysis - Worldwide Opportunities, Revenue, Production, Demand and Geographical Forecast To 2025 - Dagoretti News

The research report on the Global Cloud Server Market offers regional as well as global market information and estimates the valuation the market is expected to reach over the forecast period. The report covers the market's registered growth over the anticipated timeline, provides a significant analysis of this space, and focuses on the crucial aspects of the remuneration currently held by the industry. It also analyzes market segmentation and the many lucrative opportunities offered across the industry.

According to the report, multi-featured product offerings are likely to have a strong positive influence on the Global Cloud Server Market and to contribute substantially to its growth during the prediction period. The report also covers other significant market trends and crucial drivers expected to impact growth over the forecast period.

This study covers the following key players: IBM, HP, Dell, Oracle, Lenovo, Sugon, Inspur, CISCO, NTT, Softlayer, Rackspace, Microsoft, Huawei

Request a sample of this report @ https://www.orbismarketreports.com/sample-request/65000

The report includes substantial information on the driving forces that influence the vendor portfolio of the Global Cloud Server Market and their impact on market share in terms of industry revenue. It also analyzes current market trends, classifying them into the challenges and opportunities the market will present in the coming years.

In addition, a shift in customer focus towards alternative products may restrict demand and thus hinder the growth of the Global Cloud Server Market. The market is also highly concentrated, with only a few leading players present. These major players, however, are continually concentrating on innovative, multi-featured solutions that can offer significant benefits to their businesses.

The research report focuses on manufacturer data such as price, gross profit, shipment, business distribution, revenue, and interview records, information that will help users better understand the major competitors. The report also covers countries and regions across the globe, presenting the regional status of the market including volume and value, market size, and price structure.

Market segment by Type, the product can be split into: Logical Type, Physical Type

Access Complete Report @ https://www.orbismarketreports.com/global-cloud-server-market-size-status-and-forecast-2019-2025

Additionally, the report will help clients recognize fresh, lucrative growth opportunities and build unique growth strategies through a complete analysis of the market, its competitive landscape, and the product offerings of the various companies. The report is prepared to present the global and local market landscape along with guidelines on contemporary market size, trends, share, registered growth, driving factors, and the dominant competitors of the Global Cloud Server Market.

The report covers all significant information about market manufacturers, traders, distributors, and dealers. This information helps clients understand the product scope, market driving forces, market overview, market risk, technological advancements, market opportunities, challenges, research findings, and key competitors. The report also offers an in-depth analysis of upstream raw materials as well as downstream demand for the Global Cloud Server Market.

Market segment by Application, split into: Education, Financial, Business, Entertainment, Others

For Enquiry before buying report @ https://www.orbismarketreports.com/enquiry-before-buying/65000

Some TOC Points:

1 Report Overview
2 Global Growth Trends
3 Market Share by Key Players
4 Breakdown Data by Type and Application

Continued

About Us:

With unfailing market gauging skills, Orbis Market Reports has been excelling in curating tailored business intelligence data across industry verticals. Constantly striving to expand our skill development, our strength lies in dedicated intellectuals with dynamic problem-solving intent, ever willing to mold boundaries to scale heights in market interpretation.

Contact Us:
Hector Costello
Senior Manager, Client Engagements
4144N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A.
Phone No.: USA: +1 (972)-362-8199 | IND: +91 895 659 5155


The trillionaires' club: How big data, cloud and AI are powering the tech economy – SiliconANGLE

Last week Google LLC's parent company Alphabet Inc. became the fourth U.S. firm to enter the trillionaires' club, joining the likes of Apple Inc., Microsoft Corp. and Amazon.com Inc. in achieving a $1 trillion market capitalization.

In his latest breaking analysis video, Dave Vellante, chief analyst at SiliconANGLE sister market research firm Wikibon and co-host of SiliconANGLE's video studio theCUBE, said Alphabet's and Google's success is just the latest example of how growth in the tech industry is being driven by an innovation cocktail centered on artificial intelligence, big data and the cloud.

"The source of innovation in the technology business has been permanently altered," Vellante said. "There is a new cocktail of innovation that will far surpass Moore's Law in terms of its impact on the industry."

The analyst explained that for decades, innovation in the technology industry has gone hand in hand with Moore's Law, which refers to the observation that the number of transistors in a dense integrated circuit doubles about every two years. The basic premise of Moore's Law is that whoever is first with the latest and greatest microprocessors has a significant competitive advantage over its rivals, and it's this that has propelled the industry through the PC era and the client-server era.
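As a rough check on why that observation carried so much weight, doubling every two years compounds to roughly a 32-fold increase over a single decade:

```latex
\[
N(t) = N_0 \cdot 2^{\,t/2}
\qquad\Longrightarrow\qquad
\frac{N(10\ \text{years})}{N_0} = 2^{5} = 32.
\]
```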

Now, though, as Moore's Law slows, it's being pushed aside by a combination of technologies including big data, cloud, and machine intelligence and AI, which is likely to power a new era of innovation for the next 20 years and beyond.

"The cloud brings three things: agility, scale and the ability to fail quickly and cheaply," Vellante said. "So it is these three elements and how they are packaged and applied that will in my view determine winners and losers in the next decade and beyond."

Vellante said this new era of innovation is being driven by three factors: the availability of cheap storage and compute, the emergence of new processor types such as graphics processing units that can power AI workloads effectively, and the vast troves of big data that the major technology companies have accumulated.

For evidence of this shift, we need look no further than the market caps of the top five public companies in the U.S., namely Apple, Microsoft, Google, Amazon and Facebook Inc. Their success and their huge market valuations are primarily thanks to the fact that they've all emerged as leaders in digital and are among the best-positioned to apply machine intelligence to the massive stores of data they possess.

A quick look at Enterprise Technology Research's latest market share data, which is a measure of pervasiveness in its spending surveys, seems to underscore that conviction. Over the last decade, Microsoft has maintained its presence in the survey in the 70% to 90% range, while AWS's and Google's importance has risen slowly but steadily during the same period.

"If I superimposed traditional enterprise players like Cisco, IBM, Hewlett Packard, Dell, et cetera, that is, companies that aren't competing with data at the core, you would see a steady decline," Vellante said.

Although Apple with its iPhone perhaps doesn't quite fit with this story, its real value, and the key determining factor in its success, has to do with how it combines data with machine intelligence to compete in apps, content and digital services.

The sheer value of that data has enabled those companies to expand into new markets, further driving their growth. Take Amazon, for example, with its recent moves into content, groceries, logistics and so on. Many of these moves are being financed by the company's cloud infrastructure business, which represents just 12% of its total turnover but almost half of its operating income.

Additional data from ETR shows there's a lot of spending action around some of the latest cocktail technologies that are powered by AI, cloud and big data. For example, spending on container orchestration is likely to increase by 29% this year, while container platforms will grow 19.7% and machine learning and AI will rise 18%.

"The ETR data shows that the spending action is around cloud, AI and data," Vellante said. "And in the red are Moore's Law technologies like servers and storage."

At first glance, this all seems like bad news for legacy enterprise incumbents such as IBM Corp. and Hewlett-Packard Enterprise Co., which are not digital natives and were not born in the cloud. But the future is far from certain, and members of the trillionaires' club need to be wary of complacency.

"While the trillionaires look invincible today, history suggests they are not invulnerable," Vellante said. "The rise of China, India, open source and open models could coalesce and disrupt these guys if they miss a step."

For the incumbent enterprises, the good news is they don't have to build any new technology to better compete. Rather, what they need to do is apply machine learning to their unique data models and buy technologies such as AI and cloud that they need from existing suppliers.

"The degree to which they are comfortable buying from these suppliers, who may also be competitors, will play out over time, but I would argue that building that competitive advantage sooner rather than later with data and learning to apply machine intelligence to their unique businesses will allow them to thrive," Vellante said.



Nebulon gets less nebulous, aims to replace SAN and hyperconverged storage – Blocks and Files

Nebulon, a cloud-defined storage startup founded by four 3PAR veterans, is opening the stealth curtain a little.

On its website, the California company has revealed it is developing a cloud-managed replacement for Fibre Channel SANs, hyperconverged systems and software-defined storage. The system uses commodity SSDs and provides cloud-managed, secure, autonomous operations for server-centric storage on-premises.

Nebulon tells infrastructure owners its product is a server-based data storage approach that is virtually zero touch, a fraction of the cost of arrays and SANs, without compromising SLAs.

Nebulon software manages single tier all-flash storage with enterprise-class reliability and low latency in the public cloud (currently AWS). This back-end management is said to be AI-based and provides composable operations and automatic updates for the on-premises software. It can manage multiple Nebulon arrays.

Job adverts posted on Nebulon's site indicate some product characteristics. For example: "Contribute to the design, implementation, and operation of cloud-based management, telemetry, operations, and analysis software."

This implies that the on-premises system sends telemetry to the cloud back-end management facility. The composability point indicates that the on-premises storage facility can be re-organised.

We get a sense of a management facility receiving telemetry from the on-premises system and running AI-type analytics on the data, with real-time control info returned to the on-premises system to optimise operations.
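Nebulon has published no API details, so the following is only a rough sketch of the pattern described above: an on-premises agent periodically posts telemetry to a cloud control plane and applies whatever tuning parameters the back end returns. Every endpoint, field and function name here is an assumption used purely for illustration, not Nebulon's actual interface.

```python
import time
import requests

# Hypothetical cloud control-plane endpoint -- assumed for illustration only.
CLOUD_ENDPOINT = "https://cloud-mgmt.example.com/v1/telemetry"

def collect_telemetry():
    """Gather local metrics from the on-premises storage node (stubbed here)."""
    return {
        "node_id": "onprem-node-01",
        "ssd_utilization_pct": 62.5,
        "read_latency_ms": 0.4,
        "write_latency_ms": 0.9,
    }

def apply_settings(settings):
    """Apply whatever tuning the cloud back end recommends (stubbed here)."""
    print("applying settings:", settings)

while True:
    resp = requests.post(CLOUD_ENDPOINT, json=collect_telemetry(), timeout=10)
    resp.raise_for_status()
    # The back end runs its analytics and returns optimised parameters.
    apply_settings(resp.json().get("recommended_settings", {}))
    time.sleep(60)  # report once a minute
```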

The on-premises component has a customer web interface. Nebulon's system is said to have zero footprint and provides maximum workload density for any OS/hypervisor with no additional server software required.

This suggests that the hardware is a server with populated SSD bays and Nebulon software. It is called a buy-as-you-need application server. Taken literally, this means it runs customers' applications and is therefore a kind of hyperconverged system.

We think the zero footprint claim is more marketing than something to be taken literally. The on-premises kit has Nebulon software running in it, so it is not a zero-footprint device in that sense.

There is no announcement date, but customers willing to be beta testers are invited to get in touch. Blocks & Files thinks a Q3 or Q4 2020 launch is likely, based merely on our sense of the industry and nothing concrete. Nebulon's product launch remains nebulous.


Global Private Cloud Server Market Growth Opportunities, Challenges, Competitive Analysis And Forecast 2020-2025 - Expedition 99

The research report on the Private Cloud Server market offers a comprehensive study of market share, size, growth aspects, and major players. It contains brief information on the regional competitive landscape, market trends, drivers, opportunities and challenges, distributors, sales channels, risks and entry barriers, as well as a Porter's Five Forces analysis. The main objective of the report is to offer a detailed analysis of how these market aspects may influence the future of the Private Cloud Server market. The report also provides a comprehensive analysis of competitive manufacturers as well as new entrants.

Request a sample here: https://www.orbisresearch.com/contacts/request-sample/2354947

In addition, the report contains accurate data on the price, revenue, market share, and production of the service providers. It focuses on current developments, new possibilities and advancements as well as dormant traps, and offers a complete analysis of the current situation and the advancement possibilities of the Private Cloud Server market across the globe. The report analyses substantial key components such as production, capacity, revenue, price, gross margin, sales revenue, sales volume, growth rate, consumption, import, export, technological developments, supply, and future growth strategies.

Moreover, the report offers a detailed analysis of the competitive landscape by region and highlights the major service providers along with their market overview, business strategies, financials, recent developments, and product portfolios. It comprises significant data on market segmentation by type, application, and regional landscape, and provides a brief analysis of the opportunities and challenges faced by the leading service providers. The report is specially designed to deliver accurate market insights and market status.

The key players covered in this study

Amazon, Microsoft, Google, Dropbox, Seagate, Egnyte, Buffalo Technology, SpiderOak, MEGA, D-Link, ElephantDrive, Mozy Inc., POLKAST, Dell, Just Cloud, Sugarsync

Market segment by Type, the product can be split into

User Host, Provider Host

Market segment by Application, split into

Individual, Small Business, Large Organizations

Market segment by Regions/Countries, this report covers

United States, Europe, China, Japan, Southeast Asia, India, Central & South America

Get the DISCOUNT on this report: https://www.orbisresearch.com/contacts/discount/2354947

The study objectives of this report are:

To analyze global Private Cloud Server status, future forecast, growth opportunity, key markets and key players.
To present the Private Cloud Server development in the United States, Europe and China.
To strategically profile the key players and comprehensively analyze their development plans and strategies.
To define, describe and forecast the market by product type, market and key regions.

In this study, the years considered to estimate the market size of Private Cloud Server are as follows:

History Year: 2013-2017
Base Year: 2017
Estimated Year: 2018
Forecast Year: 2018 to 2025

Major Points From Table of Content:

Chapter One: Report Overview
Chapter Two: Global Growth Trends
Chapter Three: Market Share by Key Players
Chapter Four: Breakdown Data by Type and Application
Chapter Five: United States
Chapter Six: Europe
Chapter Seven: China
Chapter Eight: Japan
Chapter Nine: Southeast Asia
Chapter Ten: India
Chapter Eleven: Central & South America
Chapter Twelve: International Players' Profiles
Chapter Thirteen: Market Forecast 2018-2025
Chapter Fourteen: Analysts' Viewpoints/Conclusions
Chapter Fifteen: Appendix

Browse the complete report: https://www.orbisresearch.com/reports/index/global-private-cloud-server-market-size-status-and-forecast-2018-2025

About Us:

Orbis Research (orbisresearch.com) is a single point of aid for all your market research requirements. We have a vast database of reports from the leading publishers and authors across the globe. We specialize in delivering customized reports as per the requirements of our clients. We have complete information about our publishers and hence are sure about the accuracy of the industries and verticals of their specialization. This helps our clients to map their needs, and we produce the perfect required market research study for our clients.

Contact Us:

Hector Costello
Senior Manager, Client Engagements
4144N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A.
Phone No.: +1 (972)-362-8199; +91 895 659 5155


Experts discuss industry response to multicloud – IT Brief New Zealand

Article by NetEvents editor Lionel Snell

When everyone first started talking about the cloud, it looked as if the pendulum might be swinging back towards a client/server situation, with the cloud being the server to a worldwide population of relatively thin clients.

Big cloud providers encouraged that model: give us your data and we will sell you access to our services.

But that one cloud has evolved from being a single entity to a broad concept, one that included private as well as public clouds, and then the inevitable hybrid clouds incorporating both public and private clouds.

Now we have multicloud: is this just a new name for a hybrid cloud?

As I understood it, the difference should be that a hybrid cloud means an interconnected combination of public and private clouds, so that they become one integrated whole, whereas a multicloud means relying on several cloud services from several vendors for business purposes.

But the two are not distinct: for example, Dell EMC cloud solutions offer to transform IT by leveraging a multicloud approach spanning a variety of public, private and hybrid resources.

And IBM's multicloud solutions page says: "Multicloud is a cloud adoption strategy that embraces a mix of cloud models (public, dedicated, private, managed) to best meet unique business, application and workload requirements."

Wikibon chief research officer and general manager Peter Burris says: The fundamental business objective is to use data as an asset... digital business is about how we are going to put data to work differently.

In particular, data is being used to further the current trend of transforming products into services: and that is just what cloud is already doing in the IT industry, an important point because it means that the way the cloud is developing now could be a pattern for the way future businesses will develop.

Instead of repeating the usual cliché about data being the new oil, he pointed out what a lousy analogy that was: "Data is easily copied. It's easily shared. It's easily corrupted. It does not follow the laws of scarcity, and that has enormous implications, certainly for all the vendors on the panel and virtually every enterprise on the planet."

Seeing cloud development as a roadmap for broader, longer-term tech-industry trends does make this a vital topic, and it emphasises the point that the cloud is not about centralising computing on a massive scale, but about creating simpler, more powerful distributed computing.

Rather than pass our data up into some provider's cloud, we would rather keep the data in place: where it is gathered, where it is most secure, where intellectual property is easiest to protect, and where the actual business takes place.

This is not about moving data into the cloud. This is about moving the cloud and cloud services to the data. Within 10 years the cloud is going to reflect a natural organisation of data, whether it's at the edge, whether it's in the core or whether it's in public cloud attributes.

Cisco cloud platforms and solutions group product management VP Jean-Luc Valente points out that it was one thing to upload a terabyte of data to a cloud, but as the surge in data and applications rises towards exabytes, he says it would cost $30 million to upload just one of those to a public cloud.

This explosion of data at the edge is very serious from a networking and security angle.

Over the decades, networking has evolved from being a means to connect devices, to connecting sites, and connecting pages and individuals on social media, so is it moving towards primarily connecting data and services?

According to NetFoundry CEO Galeal Zino: "Now that the application is the new edge, and data is everywhere, we actually need to reinvent networking and the ecosystems around networking to match that new reality."

NetScout strategic alliances area VP Michael Segal referenced recent discussions about AI and machine learning using data to train automatic processes.

A lot of this would require analysing data in real-time, so edge computing becomes very important. The data needs to be close to where it's being analysed and where it provides insight in real-time.

Burris emphasises the increasingly critical role of the network: the actual training algorithms in use date way back before 2000; it was just that until recently there wasn't the parallel computing capability to put them to work effectively.

Apstra CEO & founder Mansour Karam is another advocate for this exciting time to be in networking.

He says: "Managing networks like before no longer works. You can't manage networks manually by configuring devices by hand. It has to be done through software. It has to be done through powerful automation. You have to have the ability to abstract out all of those network services across all of those domains and you have to have the ability to operate these networks, enforce those policies, set these configurations and verify them remotely in every location where data resides."

So the importance of the multicloud is not where the data lies, but how it is managed in an agile manner by leveraging service mesh technology and applying containers, DevOps, or DevSecOps. And once we can manage the edge with that same level of agility and automation, all of a sudden the data and the applications will exist wherever they are best put.

Segal compares this spread to the architecture of the modern data centre, where there are "a lot of server farms and a lot of east-west traffic and containers and virtualised environments in the data centre itself."

Then you extend it, not necessarily immediately to the public cloud - in some cases to private clouds such as Equinix.

Then you can have several different public cloud providers - Oracle, AWS, Microsoft Azure - and think about the complexities associated with connecting everything; many of them are edge computing environments.

Another point Burris made is that there has been a lot of emphasis on the data explosion, but what about the attendant software explosion as we move into a realm where these distributed services are accessed as both applications and data? Automation and abstraction require software; entities will be defined and policies enforced in software.

There's going to be an enormous explosion in the amount of software that's being generated over the next few years.

But is that the real business issue?

Oracle Vice President Jon Mittelhauser works mostly with Fortune 1000 companies and government departments, where "a lot of our value add is the fast connection between customer data centres. The data can live in either place, but I agree that it's the key asset."

For most companies, their data is their asset, not the software. Here in Silicon Valley, the software is highly valued, but outside of Silicon Valley it's the data, or what they do with the data, which software helps you with.

Mansour Karam sees a transition from the days when one began by partnering with hardware vendors.

Once the hardware was agreed, then one decided what software to use.

But that meant being limited to the software offerings that the particular hardware vendor supports. In this new, software-first world, they start by partnering strategically with software vendors to define this layer of software first, this service layer. Once they've done that, they can go on and shop for hardware that specifically meets their needs.

To sum up, Peter Burris emphasises three key points:


Data Governance And Smart Cities Are Helping Improve Quality Of Life In Japan – Forbes

In 2015, Kakogawa City had the third-worst crime rate in Hyogo, a prefecture in western Japan neighboring Osaka. Local authorities decided to implement a smart networked camera and sensor system they call mimamori, which means to watch over someone. With this, residents can monitor their children and elderly relatives. The system helps ensure their safety and security while protecting their data and privacy. It's one of the latest examples of how smart cities and data governance are helping improve society in Japan.

Fighting crime in the smart city

Located on the Seto Inland Sea about 30 km west of Kobe, Kakogawa is a city of about 264,000 people. To meet residents' strong demands for safe streets, the municipal government worked with the Ministry of Internal Affairs and Communications and private businesses, including NEC and Nikken Sekkei Research Institute, to launch the mimamori system.

Municipalities in Japan and overseas are turning to smart city solutions to address social issues, prompting an explosion in data generated by cameras and other sensors.

In 2017 and 2018, the city installed about 1,500 networked cameras mainly around schools and school routes. About 2,000 sensors were also installed, both in fixed locations and on 265 government vehicles and 176 Japan Post motorcycles. The system is able to detect residents carrying Bluetooth Low Energy tags to confirm their location. The city is using FIWARE, a framework of open-source components to power smart cities and protect their data. The data is uploaded to cloud servers and the information is made available to volunteers and family members via the Kakogawa App.
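FIWARE context brokers expose an NGSI-style REST interface for publishing entity updates such as a tag's last observed location. The sketch below shows roughly what such an update could look like against an Orion Context Broker; the broker address, entity naming and attribute values are illustrative assumptions and are not taken from Kakogawa's actual deployment.

```python
import requests

# Illustrative broker address and entity model -- not the city's real system.
ORION_URL = "http://orion.example.org:1026/v2/entities"

entity = {
    "id": "urn:ngsi-ld:BleTag:child-0042",
    "type": "BleTag",
    "location": {
        "type": "geo:json",
        # GeoJSON order is [longitude, latitude]; coordinates are approximate.
        "value": {"type": "Point", "coordinates": [134.84, 34.76]},
    },
    "observedAt": {"type": "DateTime", "value": "2019-11-01T08:15:00Z"},
}

# Create the entity; subsequent readings would update its attributes instead.
resp = requests.post(ORION_URL, json=entity)
print(resp.status_code)
```

Family members' apps would then query the broker (or a service in front of it) rather than the cameras and sensors directly, which is where access policies and privacy controls can be enforced.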

The networked camera and sensor system has already had an effect. Aside from making residents feel more secure about their loved ones, the crime rate in Kakogawa fell below the Hyogo Prefecture average for the first time in November 2018.

"We built an environment in which children and elderly people can be monitored by the local community," says Nishimori Yoko, a Kakogawa City official. "The system can be effective in the event of an emergency. We have had several cases of missing people who were located quicker compared to before the system was deployed."

Some residents were worried about leaks of images from the system containing personal information, but the city has emphasized its policies on privacy and data governance. To assuage public concern about the new system, Kakogawa Mayor Yasuhiro Okada visited 12 sites and briefed members of the public. In a survey of 862 residents, over 98% responded that the system was necessary or probably necessary.

Kakogawa City installed about 1,500 networked cameras mainly around schools and school routes. Residents can monitor loved ones' locations via a city app.

Kakogawa is working with other municipalities around the world to promote smart city policies. Representatives joined the G20 Global Smart Cities Alliance Launch Event held in Yokohama in October 2019 under the aegis of the Cabinet Office of the Government of Japan and the World Economic Forum Centre for the Fourth Industrial Revolution Japan (C4IR Japan). Participants including representatives from cities such as Barcelona and Cincinnati discussed issues including the use of technology in smart cities. Fifteen cities announced the launch of the G20 Global Smart Cities Alliance on Technology Governance, which is focused on producing standards for connected devices in public spaces.

Protecting data in Society 5.0

"There's a smart city boom emerging in Japan now, and data governance is the lifeblood of smart cities," says C4IR Japan head Suga Chizuru, who spoke at the Yokohama event. "We're doing this to benefit citizens by tackling technical, unattractive issues."

Representatives from smart cities around the world joined the G20 Global Smart Cities Alliance Launch Event held in Yokohama in October 2019.

The Ministry of Economy, Trade and Industry chose Suga for C4IR Japan based on her outstanding performance. At the ministry, she organized a study group aimed at modernizing Japans financial regulations amid the rise of fintech. As she gave presentations on how Japan should embrace fintech, the group gained members and attention. Eventually, it helped get legislative reforms to accelerate the adoption of fintech on the agenda in Japan. At C4IR Japan, Suga is focused on facilitating global consensus around data governance, with expert groups on healthcare, smart cities and mobility.

As more and more municipalities in Japan and overseas turn to smart city solutions to address social issues, the volume of data being generated by cameras and other kinds of sensors is seeing explosive growth. Managing that data is becoming increasingly important amid the expansion of Society 5.0, defined by the Cabinet Office as a human-centered society that balances economic advancement with the resolution of social problems by a system that highly integrates cyberspace and physical space.

Data governance is also at the heart of policies being promoted by the Japanese government and its partners. Research firm Gartner defines data governance as the specification of decision rights and an accountability framework to ensure the appropriate behavior in the valuation, creation, consumption and control of data and analytics.

That's one of the aims of C4IR Japan, which was established in 2018 in an unprecedented partnership between the WEF, the Japanese government and Japanese organizations and corporations. It's dedicated to maximizing the benefits of the Fourth Industrial Revolution and Society 5.0, which are periods of rapid change driven by progress in science and technology, by promoting open innovation and interoperability in policymaking.

To emphasize that data governance must be a key priority in the Fourth Industrial Revolution, the center hosted a data governance conference in November 2018. Prime Minister Abe Shinzo followed up with a speech at the Davos Forum annual conference in January 2019, announcing that his administration would prepare the Osaka Track for Data Governance. G20 leaders, together with World Trade Organization Director-General Roberto Azevêdo, joined Abe during the G20 Osaka summit to discuss the importance of the digital economy. They adopted the Osaka Track on the digital economy to craft rules on governance in international data traffic under the motto Data Free Flow With Trust (DFFT). The new rules are designed to benefit individuals, businesses, organizations and even smart cities like Kakogawa.

"No country has the best data governance solution, and we're still in an exploration phase as we share knowledge for the best governance framework," says Suga. "Flexible and appropriate data governance will enable societies to enjoy the fruits of the Fourth Industrial Revolution and redistribute its wealth."



The Benefits of Identity Management for Healthcare Businesses – Solutions Review

Identity management for healthcare businesses offers more than an opportunity to fortify IT infrastructures. It can also help ensure compliance, specifically with the Health Insurance Portability and Accountability Act (HIPAA), and improve the user experience. Therefore, your healthcare business needs to consider the benefits of identity management.

Here, we present the three major benefits of identity management for healthcare businesses, and how it all fits together. However, first, we need to address what makes healthcare identity management so complicated and challenging.

First, healthcare, perhaps more than any other industry, deals with constantly expanding business lines and mergers and acquisitions. This means improved and competitive service for patients, but it also means growing attack surfaces for threat actors.

Additionally, continually growing networks, as seen in healthcare, result in fragmented patient data. Often, this means sensitive data may exist in unsecured databases, allowing for easy theft. Fragmented patient data can also lead to redundant and unnecessary care, misdiagnosis, and incorrect medication.

So fragmented patient data doesn't just constitute a threat to databases and network security; it can also impact patients' physical safety. Moreover, identity management for healthcare faces the challenges typical of other enterprises expanding their IT infrastructures. Usually, these include dealing with on-premises applications, edge devices, and new cloud applications.

Finally, healthcare organizations need to deal with the erosion of the network perimeter and medical services devices such as medical IoT. So what can identity management for healthcare businesses actually do to solve these problems?

Obviously, the first benefit of identity management centers on cybersecurity. Hackers frequently target healthcare providers in part because these businesses rarely deploy proper cybersecurity protocols; in fact, according to Armis, WannaCry continues to wreak havoc on healthcare businesses even after the devastating 2017 wave. In other words, healthcare enterprises have not adopted proper cybersecurity protocols despite the known threats targeting them.

Therefore, your healthcare business needs a next-generation solution that repels hackers and maintains consistent access rules through authentication. Strong authentication not only stops hackers, but it also deters less experienced ones by demonstrating identity management awareness. Also, authentication can fortify and monitor web applications, cloud servers, and patient portals, the various environments in which healthcare operates.

Further, next-generation identity management for healthcare enables your business to benefit from multifactor authentication (MFA). Multifactor authentication is unquestionably the strongest form of authentication available to enterprises of all sizes.

MFA doesn't just rely on passwords to verify users; this is just as well, as passwords prove easy to circumvent, guess, or otherwise subvert. Instead, MFA uses all of the tools available to create a barrier between the user's access request and the data. Factors may include hard tokens, biometrics, geofencing, time of access request monitoring, and context.
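As a concrete illustration of one common second factor, the sketch below verifies a time-based one-time password (TOTP) with the pyotp library. It is a minimal example, not a description of any particular vendor's product; the account and issuer names are made up.

```python
import pyotp  # pip install pyotp

# Enrollment: generate a per-user secret and share it with the user's
# authenticator app (usually via a QR code of the provisioning URI).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("Provisioning URI:", totp.provisioning_uri(
    name="clinician@example-hospital.org",   # hypothetical account
    issuer_name="ExampleHealth Portal"))     # hypothetical issuer

# Login: the password check (first factor) happens elsewhere; here we verify
# the second factor, a 6-digit time-based code from the user's device.
submitted_code = input("Enter the 6-digit code: ")
if totp.verify(submitted_code, valid_window=1):  # tolerate one 30-second step of clock drift
    print("Second factor accepted")
else:
    print("Second factor rejected")
```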

For healthcare, frequent challenges in identity and access management include lifecycle management, governance, and multiple login points. While numerous next-generation identity management capabilities can help solve these challenges, single sign-on (SSO) can certainly solve the latter. SSO helps prevent the multiple log-ins, and thus multiple passwords, that expand the attack surface.

If you work in healthcare, you care about HIPAA. This compliance mandate focuses on patient privacy and protections; it involves not only technical safeguards for patients but physical and administrative safeguards as well. In other words, HIPAA places a significant security burden on your healthcare organization.

However, this comes with good news: HIPAA compliance helps your business tap into markets that use electronic health records, which the overwhelming majority of physicians and hospitals now use.

Thankfully, next-generation identity management for healthcare businesses can help you achieve HIPAA compliance. First, authentication and access management help ensure patients' data stays secure on your networks, fulfilling part of the mandate. Second, solutions with governance capabilities often feature out-of-the-box reporting and automated forms for HIPAA compliance.

Finally, identity management can actually make the patient user experience, and ultimately their care, better. Through identity federation, disparate databases containing patient information can be centralized and secured simultaneously. This can prevent the problems of fragmented patient data while keeping the information out of reach of hackers.

As stated above, single sign-on also avoids the need for continual authentication and log-ins. Identity management can optimize workflows and assist with reviewing coverage, managing claims, and scheduling appointments. In short, this level of personalization should call to mind customer identity and access management (CIAM); it helps maintain consistent patient data and makes patients feel individually recognized and appreciated.

Check out our Identity Management Buyer's Guide. We cover the top solution providers, their use cases, and key capabilities in detail.

Ben Canner is an enterprise technology writer and analyst covering Identity Management, SIEM, Endpoint Protection, and Cybersecurity writ large. He holds a Bachelor of Arts Degree in English from Clark University in Worcester, MA. He previously worked as a corporate blogger and ghost writer. You can reach him via Twitter and LinkedIn.



Suspect Wanted in May Shooting Arrested in St. Cloud – WJON News

ST. CLOUD -- The St. Cloud Police Department has arrested a man wanted in a home invasion in eastern Minnesota last May.

St. Cloud Assistant Police Chief Jeff Oxton said 19-year-old Nicholas James of St. Cloud was arrested on a warrant out of Chisago County just after 1:30 a.m. Wednesday. Oxton says James was arrested in the 1100 block of 13th Street South without incident.

James was wanted on robbery and attempted murder charges.

According to the Chisago County Sheriff's Office, three suspects broke into a home back on May 16th. Authorities say a 22-year-old victim was shot and wounded. The victim was airlifted to a Twin Cities hospital.

The other two suspects were brought into police custody shortly after the incident. The sheriff's office says the suspects and victim knew each other and does not believe the incident was random.



How the cloud went from 0 to 100 in ten years – Express Computer

Synergy Research Group's detailed review of enterprise IT spending over the last ten years shows that annual spending on cloud infrastructure services has gone from virtually zero to almost $100 billion. Meanwhile, enterprise spending on data center hardware and software has been stagnant through much of the decade. Data center spending did jump in 2018 even though server unit shipments remained flat, thanks to more richly configured and higher-priced servers.

Despite that 2018 increase in data center spending, growth in cloud spending did not miss a beat in 2018 and then grew again by almost 40% in 2019. Over the whole decade, average annual spending growth for data center was 4% (mostly due to the first three years) and for cloud services was 56%. 2019 will mark the first time that enterprises spend more on cloud services (IaaS, PaaS and hosted private cloud) than they do on data center equipment.
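Those two growth rates compound very differently. The sketch below is purely illustrative: the 2009 baselines are assumed round numbers rather than Synergy's actual figures, but it shows how a 4% and a 56% annual rate diverge over a decade and why a crossover around 2019 is plausible.

```python
# Compounding sketch with assumed 2009 baselines (illustrative only, in $ billions).
def compound(start, annual_rate, years):
    return start * (1 + annual_rate) ** years

datacenter_2009 = 65.0   # assumed data center baseline
cloud_2009 = 1.0         # "virtually zero" cloud infrastructure spend, assumed ~$1B

for year in range(0, 11):
    dc = compound(datacenter_2009, 0.04, year)   # ~4% average annual growth
    cl = compound(cloud_2009, 0.56, year)        # ~56% average annual growth
    print(f"year {2009 + year}: data center ~${dc:.0f}B, cloud ~${cl:.0f}B")
```

With those assumed starting points, the 2019 outputs land near the roughly $93 billion and $97 billion figures quoted below, which is the point of the exercise rather than a reconstruction of Synergy's model.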

Based on actual spending in Q1-Q3 and its forecast for Q4, Synergy projects that 2019 worldwide spending on data center hardware and software (comprising servers, storage, networking, security and associated software) will be over $93 billion. The major segments with the highest growth rates over the decade were virtualization software, Ethernet switches and network security. Server share of the total data center market remained steady while storage share declined. Synergy projects that 2019 worldwide spending on cloud infrastructure services will reach $97 billion. The major segments with the highest growth rates over the decade were mainly within PaaS especially database, IoT and analytics. IaaS share of the total held reasonably steady while managed private cloud service share declined somewhat.

"The decade has seen a dramatic increase in computer capabilities, increasingly sophisticated enterprise applications and an explosion in the amount of data being generated and processed, pointing to an ever-growing need for data center capacity. However, over half of the servers now being sold are going into cloud providers' data centers and not those of enterprises," said John Dinsdale, a Chief Analyst at Synergy Research Group. "Over the last ten years we have seen a remarkable transformation in the IT market. Enterprises are now spending almost $200 billion per year on buying or accessing data center facilities, but cloud providers have become the main beneficiaries of that spending."



The Evolution of Bitcoin’s Technology Stack – Cointelegraph

Over the last 10 years, the Bitcoin ecosystem has attracted developers to dedicate thousands of hours to improving and revamping most of its underlying codebase. Yet Bitcoin (BTC) is largely the same. The reason is that the core set of consensus rules defining its monetary properties, such as its algorithmic inflation and hard-coded supply, remains unchanged.

Time and time again, factions have attempted to change these core properties, but all hostile takeovers thus far have failed. It's often a painful process but one that highlights and solidifies two of Bitcoin's biggest virtues: No single party can dictate how Bitcoin evolves; and the absence of centralized control protects Bitcoin's monetary properties.

The values that make Bitcoin a popular phenomenon are also those that make developing software atop Bitcoin more challenging than on any other digital asset. Developers are limited in what they're able to transform so as not to undermine its apparatus as a store of value.

Nonetheless, as we'll see from the examples below, innovation in Bitcoin is possible. It requires creativity and patience.

Since changing Bitcoin's core layer requires a quasi-political process that may infringe upon its monetary properties, innovation is often implemented as modules. This development is similar to that of the internet's protocol suite, where layers of different protocols specialize in specific functions. Emails were handled by SMTP, files by FTP, web pages by HTTP, user addressing by IP and packet routing by TCP. Each of these protocols has evolved over time to create the experience we have today.

Spencer Bogart of Blockchain Capital has captured this development succinctly: We are now witnessing the beginning of Bitcoin's own protocol suite. The inflexibility of Bitcoin's core layer has birthed several additional protocols that specialize in various applications, like Lightning's BOLT standard for payment channels. Innovation is both vibrant and relatively safe, as this layered approach minimizes potential risks.

The diagram below is an attempt to map all relatively new initiatives and showcases a more complete representation of Bitcoin's technology stack. It is not exhaustive and does not signal any endorsement for specific initiatives. It is, nevertheless, impressive to see that innovation is being pushed on all fronts, from Layer 2 technologies to emerging smart contract solutions.

There has been a lot of talk lately about the rate of adoption of the Lightning Network, Bitcoin's most prominent Layer 2 technology. Critics often point to an apparent decline in the number of channels and total BTC locked when evaluating Lightning's user adoption. Yet these metrics aren't the most definitive measurement of adoption.

Related: What Is Lightning Network And How It Works

One of the most underrated virtues of the Lightning Network is its straightforward privacy properties. Since Lightning does not rely on global state reconciliation, i.e., its own blockchain, users can transact privately over it using additional techniques and network overlays, like Tor. Activity happening within private channels is not captured by popular Lightning explorers. As such, an increase in private usage of Lightning has resulted in a decrease in what can be publicly measured, leading observers to erroneously conclude that adoption is down. While it is true that Lightning must overcome substantial usability barriers before it can enjoy wide adoption, using misleading metrics to make assertions about the current state of the network serves few.

Another recent development in the field of Layer 2 privacy was the creation of WhatSat, a private messaging system atop Lightning. This project is a modification of the Lightning Network Daemon (LND) that allows the relayers of private messages, who connect the entities communicating, to be compensated for their services via micropayments. This decentralized, censorship-and-spam-resistant chat was enabled by innovations in the LND itself, such as recent improvements in the lightning-onion, Lightnings own onion routing protocol.
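To make the onion-routing idea concrete, here is a toy sketch: a message is wrapped in one encryption layer per relay, and each relay can peel only its own layer. Real Lightning onion packets use the Sphinx construction with per-hop keys derived via ECDH; symmetric Fernet keys are used here only to keep the layering visible, and the hop count and message are made up.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Three hypothetical relay hops, each with its own symmetric key.
hop_keys = [Fernet.generate_key() for _ in range(3)]

# The sender wraps the message for the last hop first, then each earlier hop.
packet = b"gm -- a WhatSat-style chat message"
for key in reversed(hop_keys):
    packet = Fernet(key).encrypt(packet)

# Each relay peels exactly one layer and forwards the remainder; only the final
# hop sees the plaintext, and no single relay learns both origin and destination.
for i, key in enumerate(hop_keys):
    packet = Fernet(key).decrypt(packet)
    print(f"hop {i} peeled a layer")

print(packet.decode())
```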

There are several other projects leveraging Lightning's private micropayment capabilities for numerous applications, from a Lightning-powered cloud computing VPS to an image hosting service that shares ad revenue via microtransactions. More generally, we define Layer 2 as a suite of applications that can use Bitcoin's base layer as a court where exogenous events are reconciled and disputes are settled. As such, the theme of data anchoring on Bitcoin's blockchain goes beyond Lightning, with companies like Microsoft pioneering a decentralized ID system atop Bitcoin.

There are projects attempting to bring back expressive smart contract functionality to Bitcoin in a safe and responsible way. This is a significant development because, starting in 2010, several of the original Bitcoin opcodes (the operations that determine what Bitcoin is able to compute) were removed from the protocol. This came after a series of bugs were revealed, which led Satoshi to disable some of the functionality of Script, Bitcoin's programming language.

Over the years, it became clear that there are non-trivial security risks that accompany highly expressive smart contracts. The common rule of thumb is that the more functionality is introduced to a virtual machine (the collective verification mechanism that processes opcodes), the more unpredictable its programs will be. More recently, however, we have seen new approaches to smart contract architecture that can minimize unpredictability and also provide vast functionality.

The devising of a new approach to Bitcoin smart contracts called Merklized Abstract Syntax Trees (MAST) has since triggered a wave of supporting technologies. Taproot is one of the most prominent implementations of the MAST structure, which enables an entire application to be expressed as a Merkle tree, whereby each branch of the tree represents a different execution outcome.
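A simplified sketch of the underlying idea: hash each spending condition, combine the hashes pairwise into a single Merkle root, and commit only to that root. This is not the exact BIP-341/Taproot construction (which uses tagged hashes and ordered branch hashing); it only illustrates the general MAST principle, and the example scripts are placeholders.

```python
import hashlib

def h(data: bytes) -> bytes:
    """Double SHA-256, as commonly used for Bitcoin-style hashing."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(leaves):
    """Pair and hash upward until a single root remains (odd leaf duplicated)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Each "script" stands in for one spending condition of a contract.
scripts = [
    b"2-of-3 multisig among Alice, Bob, Carol",
    b"Alice alone after a 90-day timelock",
    b"emergency recovery key",
]
print("commitment:", merkle_root(scripts).hex())
# To spend via one branch, reveal only that script plus its Merkle proof;
# the other conditions stay private, which is the core appeal of MAST.
```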

Another interesting innovation that has recently resurfaced is a new architecture for the implementation of covenants, or spend conditions, on Bitcoin transactions. Originally proposed as a thought experiment by Greg Maxwell back in 2013, covenants are an approach to limiting the way balances can be spent, even as their custody changes. Although the idea has existed for nearly six years, covenants were impractical to implement before the advent of Taproot. Currently, a new opcode called OP_CHECKTEMPLATEVERIFY (formerly known as OP_SECURETHEBAG) is leveraging this new technology to potentially enable covenants to be safely implemented in Bitcoin.

At first glance, covenants are incredibly useful in the context of lending, and perhaps Bitcoin-based derivatives, as they enable policies like clawbacks to be implemented on specific BTC balances. But their potential impact on the usability of Bitcoin goes vastly beyond lending. Covenants can allow for the implementation of things like Bitcoin Vaults, which, in the context of custody, provide the equivalent of a second private key that allows someone who has been hacked to freeze stolen funds.

In essence, Schnorr signatures are the technological primitive that makes all of these new approaches to smart contracts possible. And there are even edgier techniques currently being theorized, such as Scriptless Scripts, which could enable fully private and scalable Bitcoin smart contracts to be represented as digital signatures as opposed to opcodes. These new approaches may enable novel smart contract applications to be built atop Bitcoin.
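For reference, textbook Schnorr signatures over an elliptic-curve group look like the following; BIP-340 adds details such as tagged hashes and x-only public keys, which are omitted here. The linearity of the signature equation is what allows keys and signatures to be aggregated (as in schemes like MuSig) and underpins the Scriptless Scripts idea.

```latex
\begin{align*}
\textbf{Keys:}   \quad & d \ \text{(private)}, \qquad P = dG \ \text{(public)} \\
\textbf{Sign:}   \quad & \text{choose a random nonce } k, \quad R = kG, \\
                       & e = H(R \parallel P \parallel m), \qquad s = k + e\,d \pmod{n} \\
\textbf{Verify:} \quad & sG \stackrel{?}{=} R + e\,P
  \quad\text{since } sG = (k + e\,d)\,G = R + e\,P
\end{align*}
```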

There have also been some interesting developments in mining protocols, especially those used by mining pool constituents. Even though the issue of centralization in Bitcoin mining is often wildly exaggerated, it is true that there are power structures retained by mining pool operators that can be further decentralized.

Namely, pool operators can decide which transactions will be mined by all pool constituents, which grants them considerable power. Over time, some operators have abused this power by censoring transactions, mining empty blocks and reallocating hash power without the authorization of constituents.

Changes to mining protocols have aimed to subvert the control that mining pool operators can have over deciding which transactions are mined. One of the most substantial changes coming to Bitcoin mining is the second version of Stratum, the most popular protocol used in mining pools. Stratum V2 is a complete overhaul that implements BetterHash, a secondary protocol that enables mining pool constituents to decide the composition of the block they will mine, not the other way around.

Another development that should contribute to more stability is reignited interest in hash rates and difficulty derivatives. These can be particularly useful for mining operations that wish to hedge against hash rate fluctuations and difficulty readjustments.
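For context on what such derivatives would track, Bitcoin retargets difficulty every 2016 blocks so that blocks keep arriving roughly every ten minutes, with the adjustment clamped to a factor of four in either direction. The sketch below is a simplified version of that rule (the real implementation works on the compact target encoding and has a well-known off-by-one in the timespan measurement).

```python
TARGET_BLOCK_TIME = 600            # seconds (10 minutes)
RETARGET_INTERVAL = 2016           # blocks between adjustments
EXPECTED_TIMESPAN = TARGET_BLOCK_TIME * RETARGET_INTERVAL  # two weeks

def next_difficulty(current_difficulty: float, actual_timespan: float) -> float:
    """Scale difficulty by how fast the last 2016 blocks arrived,
    with the adjustment clamped to a factor of 4 either way."""
    clamped = min(max(actual_timespan, EXPECTED_TIMESPAN / 4), EXPECTED_TIMESPAN * 4)
    return current_difficulty * EXPECTED_TIMESPAN / clamped

# If hash rate surged and 2016 blocks took only 12 days instead of 14,
# difficulty rises by roughly 16.7%. The starting value is illustrative.
print(next_difficulty(13.0e12, 12 * 24 * 3600))
```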

Contrary to some arguments out there, there are a host of emerging protocols that can bring optional privacy into Bitcoin. That being said, it is likely that privacy in Bitcoin will continue to be more of an art than a science for years to come.

More generally, the biggest impediment to private transactions across digital assets is that most solutions are half-baked. Privacy assets that focus on transaction-graph privacy often neglect network-level privacy, and vice versa. Both vectors suffer from a lack of maturity and usage, which makes transactions easier to de-shield via statistical traceability analysis at either the peer-to-peer (P2P) network layer or the blockchain layer.

Thankfully, there are several projects that are pushing boundaries on both fronts.

When it comes to transaction-graph privacy, solutions like P2EP and CheckTemplateVerify are interesting because privacy becomes a by-product of efficiency. As novel approaches to CoinJoin, these solutions can increase the adoption of private transactions by users that are solely motivated by lower transaction fees. As CoinJoins, their privacy guarantees are still suboptimal, but unshielded sent amounts can be beneficial, as they preserve the auditability of Bitcoins supply.

If lower transaction fees become a motivator and lead to an increase in Bitcoin's anonymity set (the percentage of UTXOs that are CoinJoin outputs), de-anonymization via statistical analysis will be even more subjective than it already is.

There has also been considerable progress in the privacy of P2P communications, with protocols like Dandelion being tested across crypto networks. Another notable development is Erlay, an alternative transaction relay protocol that increases the efficiency of private communications and reduces the overhead of running a node. Erlay is an important improvement, since its efficiency gains enable more users to more easily complete the initial block download (IBD) and continuously validate the chain, especially in countries where ISPs impose caps on bandwidth.

These examples are only a handful of initiatives in play to transform the Bitcoin framework. Bitcoin, in its totality, is a constantly evolving suite of protocols.

While evolution within a relatively strict set of rules and values can be challenging for developers, the layered approach that we've seen unfold is what makes gradual, effective change possible. Minimizing politicism within Bitcoin and protecting its fundamental monetary properties are necessary parts of the process. Developers are learning how to work within these bounds in a meaningful fashion.

The views, thoughts and opinions expressed here are the author's alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.

Lucas Nuzzi is the director of technology at Digital Asset Research. He heads up DAR's research arm, developing original reports and insights on all areas of the cryptocurrency ecosystem. Widely regarded throughout the digital asset community as an expert on blockchain and distributed systems, Lucas has contributed to several major publications. Prior to co-founding DAR in 2017, he was a blockchain researcher and consultant for a handful of years.
