
Welcome to the age of Cloud 2.0 and what it can offer your agency – Federal News Network

Federal agencies are some dozen years into concerted efforts at cloud computing adoption. Policy has evolved from the initial "cloud first" to the current "cloud smart." And the commercial cloud services industry has also come a long way, as providers have grown beyond being simply infrastructure hosts.

"What that means is that there are a lot more options, a lot more services, a lot more capability today than we saw only a few years ago," said David Knox, group vice president and public sector chief technology officer at Oracle.

Knox said agency IT managers need to revisit their strategies and think about how they can take advantage of those options. And ask: Is their initial goal for the cloud in fact achieving what they want? How do they incorporate those new capabilities into what they're doing?

Cloud success has three important components, according to Chris Pasternak, managing director and North America Oracle lead at Accenture.

First is simply having a clear definition of what you mean by success. "Lowered cost is an obvious metric, but there are other things than costs," Pasternak said. Those include seeming intangibles like job satisfaction for the IT staff: How do your end users feel? Have you improved their experience?

Second, Pasternak said, you must have a clear technical path, including the understanding that legacy applications will need re-architecting and code refactoring. "You can't just pick them up and move them."

The third factor consists of continually monitoring performance, keeping up with new services, and keeping on top of both service level agreements and cost expectations.

"You've got to move your mentality away from legacy, into this new operating model, this new thinking," Pasternak said. That in turn requires updating governance strategies, fine-tuning data egress and other operations to control costs, and figuring out how to provide greater autonomy to developers, systems administrators and others who are dealing not with an agency data center but with a cloud services provider.

Also important: not forgetting the agility and scalability that formed the case for cloud computing in the first place.

As Knox explained, yes, data egress costs can present a scary picture. But they must be balanced against even small improvements in, say, response time. Such improvements multiply into vastly increased capacity to handle peaks in program demand. What might once have jammed a data center mainframe to failure can be answered by instantly spinning up a new server instance in the cloud.
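That kind of elasticity is now a programmable operation. As a hedged illustration (not drawn from the article), the sketch below uses the AWS SDK for Python, boto3, to add an instance on demand; the AMI ID, instance type and region are placeholders.

```python
# A minimal sketch of programmatic scale-out with the AWS SDK for Python (boto3).
# The AMI ID, instance type, and region below are placeholders, not recommendations.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one additional instance to absorb a demand peak; the same call
# can request many instances at once via MaxCount.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t3.medium",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")
```

In practice an auto scaling group would trigger this automatically, but the point stands: capacity that once required procuring hardware is a single API call away.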

He added that it's also important to factor in the increased security that comes with the major cloud providers.

"The cloud service providers have to put together a very robust infrastructure. They've hired the world's best computer security technologists and network security people. And that is inherent," Knox said. "In fact, cloud customers, simply by hosting there, get to take advantage of the security that's there."

Updating cloud strategies becomes especially important in the government-wide drive to modernize federal IT. The Biden administration has requested $9 billion for the Technology Modernization Fund.

Pasternak said legacy and other mission applications have only moved to the cloud in recent years. He advised using that experience to inform future migrations.

For example, some agencies are using cloud-hosted financial management applications that are both critical and complicated. When planning for the next mission-critical app, he said, use that experience as a stepping stone. Ask: How do I further my agenda by ceasing to customize the app and instead building with cloud-native services bolted onto the application? The final step becomes a matter of redirecting user traffic to the cloud. "That makes things so much easier when you start to think about your transition."

"Cloud has continued to mature. That means a lot more options, a lot more services, a lot more capability today than we saw only a few years ago. People need to revisit that and think about how they can take advantage of those options, how they incorporate those new capabilities into what they're doing."

David Knox, Group Vice President and Chief Technology Officer, Public Sector, Oracle

"If I want to move to cloud now, use [earlier applications] as a stepping stone. Ask, what things can I do to put [the next] application on cloud? How do I further my agenda by ceasing to customize the app, and building using cloud-native services bolted on to the application? Because then I can just re-point [users]. That makes things so much easier when you start to think about your cloud transition."

Chris Pasternak, Managing Director and North America Oracle Lead, Accenture


Read more from the original source:
Welcome to the age of Cloud 2.0 and what it can offer your agency - Federal News Network


Innovative Hires Top Cloud Exec to Accelerate Growth with AWS – PRNewswire

Innovative Solutions, an AWS Premier Consulting Partner, has appointed Jeff Valentine as President and COO.

Innovative has been serving the technology needs of businesses for more than 30 years and has experienced exponential growth since 2019 as it has expanded its managed cloud services offering into over 240 cities and 3 countries. Valentine's passion for growth paired with Innovative's team of AWS-certified cloud engineers will continue to enable businesses to modernize their operations using the cloud.

"Businesses everywhere are realizing that moving more of their applications and operations to the cloud helps them meet their goals," said Valentine. "I'm excited because Innovative helps businesses accelerate their digital transformations with our army of cloud experts, our technology, and our best practices."

Valentine is a seasoned growth leader with over 20 years of experience in starting, growing, and investing in technology companies. Most recently, he spent three years as CTO of one of the leading providers of cloud insights software for accelerating digital transformations. Valentine has served as CTO, CPO, CMO, and CEO at various technology companies since the late 1990s.

About Innovative Solutions

Innovative believes that every company will become a technology company, and we're here to help. Recognized as a premier AWS consulting partner, Innovative uses its cloud expertise and technology to help customers move to and manage cloud workloads. With an army of cloud experts and the Innovative Cloud Runbook utilizing the leading cloud technologies, Innovative gives businesses of every size the confidence to grow in the cloud with Well-Architected reviews, cloud migrations, AWS cloud hosting and resale, managed cloud services, application modernization, cloud-native software development, cloud cost optimization, cloud security monitoring, and virtual CIO and CISO consulting. For more information or for a free cloud readiness assessment, visit http://www.innovativesol.com or follow us on LinkedIn.

SOURCE Innovative Solutions

http://www.innovativesol.com

Visit link:
Innovative Hires Top Cloud Exec to Accelerate Growth with AWS - PRNewswire


Commencing a digital transformation journey with application modernization – Web Hosting | Cloud Computing | Datacenter | Domain News – Daily Host…

DevOps automation is designed to bridge the ever-widening gap between development and operations, and professional DevOps consulting services can accelerate that work. These comprehensive approaches improve efficiency while adding value to your company. Let us now look at a hand-crafted list of the benefits offered by DevOps services companies:

Data security: Security needs to be strengthened at every step of application rebuilding, especially when you opt for re-evaluation. Security must be layered into every process for a well-augmented solution. Application modernization thus promises to eradicate data security hurdles, and data security is one of the greatest indicators of stability.

Business agility to expand horizons: The new wave of innovation cannot pass through a rigid application architecture, which limits business agility in no small measure. It is therefore important to pursue legacy application modernization, which ensures that the core system can adapt to the long list of modern problems. It is time to open the gates to a flood of innovation and expand horizons.

Building the cornerstone of satisfying customer experiences: The transition from an old monolith to a cloud platform isn't a cakewalk; it demands considerable expertise. You can tell that you have tapped the potential benefits if:

Opening taps of revenue streams: Legacy application modernization is all about unlocking revenue streams with the utmost flexibility. With an established, updated system, you can add scores of customers to the loop of enhanced services, bringing innovation to the table. Furthermore, the advancement delivers operational excellence and the enhanced agility to lower maintenance costs.

Painting a picture of digital transformation: The business landscape needs modernization. In an era pacing towards digital advancement, there is a need for transformation. If you want your business to hold the placard of "committed to the future," then it is time to resort to digital transformation.

In a nutshell, legacy applications need modernization, and modernization demands immense expertise. At iauro, we aim to bring innovation to the table and mark a green flag of advancement. We believe in never resting on our laurels!

With our application modernization services, we unleash their power through brilliant technical heads. Our culture is founded on addressing your complex problems, and we perform our operations with dedication and compassion. Reach out to us today; we're merely one tap away.

Continue reading here:
Commencing a digital transformation journey with application modernization - Web Hosting | Cloud Computing | Datacenter | Domain News - Daily Host...


The Pros and Cons of Kubernetes-Based Hybrid Cloud – Data Center Knowledge

Hybrid cloud platforms increasingly fall into one of two broad categories: those that are based on Kubernetes and those that aren't. So that's one of the first fundamental decisions you now have to make when building an architecture that integrates on-premises or colocated infrastructure with a public cloud.

Kubernetes, the open source container orchestrator, is much more than a hybrid cloud platform, of course. It's a way to deploy applications -- especially, but not necessarily, those that run in containers -- on any on-prem or public cloud infrastructure or combination thereof. Supporting hybrid cloud architectures is not even a primary focus of the Kubernetes project.


Nonetheless, Kubernetes provides a key benefit for hybrid deployments. It offers a uniform way to deploy and manage applications no matter which infrastructure they run on. It does this by abstracting the underlying infrastructure from the application environment. When you deploy an application on Kubernetes, the process is basically the same whether you're doing it in a public cloud, a colocation data center, or even a spare laptop that you use for testing.

And, because Kubernetes can manage application environments that span multiple types of infrastructures at once, it provides a consistent deployment and management experience even if some of your servers and applications are running in a public cloud and others are running on-premises or in a colocation facility.
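To make the uniform-deployment point concrete, here is a minimal sketch using the official Kubernetes Python client; the kubeconfig context names and the container image are hypothetical placeholders. The same Deployment object is submitted unchanged to two different clusters, with only the context changing.

```python
# Minimal sketch: the same Deployment applied to two clusters, differing
# only in which kubeconfig context is loaded. Context names and the
# container image are hypothetical placeholders.
from kubernetes import client, config

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="demo-app"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "demo"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "demo"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="demo", image="nginx:1.25")]
            ),
        ),
    ),
)

# Deploying on-prem or in a public cloud is the same call; only the
# kubeconfig context changes.
for context in ["onprem-cluster", "public-cloud-cluster"]:
    config.load_kube_config(context=context)
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment
    )
```

This is exactly the abstraction the hybrid vendors build on: the application definition never mentions the underlying infrastructure.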


Realizing this, some vendors over the past few years have taken a Kubernetes-first approach to hybrid cloud. Google Anthos, which uses Google Kubernetes Engine to manage clusters running in any public cloud or private data center, is probably the most prominent example. VMware's Tanzu platform is another.

AWS's EKS Anywhere, which can manage on-prem clusters (and potentially those running in other public clouds) through Amazon's Elastic Kubernetes Service, also qualifies as a hybrid cloud platform of sorts. It's not Amazon's main hybrid solution -- that's AWS Outposts, which provides a broader set of hybrid services -- but to the extent that EKS Anywhere supports the deployment of containerized applications that span multiple hosting environments, it fits the hybrid cloud bill.

The list of Kubernetes-based hybrid platforms stops there. The other major hybrid solutions, including AWS Outposts, Azure Stack, and Azure Arc, use other technologies as the basis for hybrid cloud management. They also all happen to support Kubernetes deployments via a hybrid architecture, but they don't use Kubernetes as the management layer for the underlying hybrid environment.

Is one approach to hybrid cloud better than another? That depends on a few variables.

The most important is whether you like managing workloads via Kubernetes more than managing them through a public cloud's standard tooling. Platforms like Anthos and Tanzu use Kubernetes to orchestrate everything, whereas solutions like Outposts and Azure Stack use the cloud's native management tooling (CloudWatch, CloudTrail, CloudFormation, and so on, in the case of AWS) for application deployment and administration. If you prefer the Kubernetes approach to application deployment and management, then a Kubernetes-based hybrid cloud platform may be right for you.

A second factor to consider is the extent to which your applications are containerized. Kubernetes can manage virtual machines as well as containers, and indeed, VM orchestration is a first-class feature in both Tanzu and Anthos. But at the end of the day, it may feel strange to manage VMs inside Kubernetes, which is designed first and foremost to orchestrate containers. VMs don't typically start and stop as fast as containers, and it's rare to launch multiple VM instances in the way you would for containers. If your workloads consist mostly of VMs, you might be better served by a hybrid cloud platform that doesn't revolve around Kubernetes.

It's worth considering, too, whether you think Kubernetes is going to stick around for the long haul. The platform is massively popular today (which is part of the reason why Google and VMware have chosen it as the basis for their hybrid strategies), but it's also only seven years old. It's not entirely crazy to think that Kubernetes could turn out to be more of a fad than a longstanding technology staple.

After all, five or six years ago, when Kubernetes was just an upstart project whose name no one could pronounce, it looked as if Docker was going to rule the world forever, and marrying your tooling to Docker seemed a safe bet. We now know how that turned out.

Committing to a Kubernetes-based hybrid platform, then, could be like going all-in on Mesosphere circa 2015: It will work as long as the hype lasts, but you may have to rebuild everything when the fad fades.

Flexibility is a final factor to consider. Generally speaking, Kubernetes-based hybrid clouds are more flexible than those that depend on a cloud vendor's proprietary tooling. If you use Azure Stack, for instance, it's going to be hard to migrate to AWS Outposts, because the migration would basically be the equivalent of moving from Azure itself to AWS. But migrating from Anthos to Tanzu would be easier -- though not seamless -- because both platforms are founded on Kubernetes.

There are solid reasons to choose Kubernetes as the basis for a hybrid cloud strategy. There are also some good reasons to select a platform that doesn't require Kubernetes tooling and that supports more types of workloads than Kubernetes can manage.

View original post here:
The Pros and Cons of Kubernetes-Based Hybrid Cloud - Data Center Knowledge


WHAT IS THE PRIVACY ARGUMENT BETWEEN FACEBOOK AND APPLE? – Web Hosting | Cloud Computing | Datacenter | Domain News – Daily Host News

It is a widely known fact that information is power. Those who have your data can use it to track you, reach you, and understand your likes and dislikes. In today's digital world, all brands obtain information about their customers very easily. In recent times, there has been a lot of conversation about users' privacy and how much information different platforms and portals should have. At the centre of this heated argument have been Facebook and Apple, each with contrasting opinions.

Why does this concern you?

Because they are discussing whether YOUR information should be in the public domain or not!

So what are they saying?

According to the new update in iOS, websites, portals and developers are supposed to ask your permission before collecting your information, to protect your privacy and maintain transparency. This means they cannot perform targeted marketing based on the choices you make, which is Facebook's core proposition. Facebook currently takes a lot of user data, from both first and third parties, so that it can utilise and monetize the data by creating target profiles for brands. Over time, Facebook has been expanding its boundaries to take more information with every click, every video you watch and every word you speak.

This means that every activity you perform on your phone, be it work or pleasure, banking or social media, research or binge-watching, can be tracked on the internet. With mobile phones playing an important role in both personal and professional lives, bank details, confidential information and more are all easily available, and having that leaked is a big risk.

However, what we need to understand is that Apple is not saying it will stop data collection for targeted advertising altogether, nor does it intend to ban advertising on its platforms. What Apple proposes is simply that users should have transparency and the choice to decide whether they want their information to be available to other platforms. The choice will allow them to know what information is being shared and what the platforms can use it for.

According to Facebook, giving this option and empowerment to users can impact the platform's business in a massive way. Apart from that, it can also affect the way small businesses market themselves, because not everyone can afford big banners or advertisements. Social media marketing is affordable, effective and delivers immense ROI. All these qualities can be attributed to companies like Facebook, which collect a whole lot of data from their users and then offer it to these businesses. The businesses can then use this information to pinpoint their target market and align their finances to reach out to it in the most effective way possible.

With the immense chatter around this topic, it is getting increasingly difficult to understand which option is more beneficial. What do we choose: better-targeted marketing or protection from information breaches? With time, we will find out.

The post WHAT IS THE PRIVACY ARGUMENT BETWEEN FACEBOOK AND APPLE? appeared first on NASSCOM Community | The Official Community of Indian IT Industry.

See more here:
WHAT IS THE PRIVACY ARGUMENT BETWEEN FACEBOOK AND APPLE? - Web Hosting | Cloud Computing | Datacenter | Domain News - Daily Host News


DXC Technology Signs Agreement with Temenos, Enabling its Large Bank Customers to Reimagine Core Banking Transformation – Business Wire

GENEVA & TYSONS, Va.--(BUSINESS WIRE)--DXC Technology (NYSE: DXC) today announced that it has signed a strategic agreement with Temenos (SIX: TEMN), the banking software company, to accelerate the digital transformation strategy for DXC's large bank customers. The expanded partnership brings together the extensive cloud hosting, implementation and integration strengths of DXC with the power of Temenos' industry-leading banking software. The two companies will jointly offer large bank customers the optimal modernization approach to address competitive, regulatory, cost and innovation drivers.

Many large banks are having to contend with complex legacy technology stacks that can include multiple core capabilities, disparate systems and product silos. These banks are embarking on transformation projects to compete with fintechs and neobanks, whose agility allows them to rapidly launch differentiated products and attract new customers. DXC and Temenos are paving the transformation journey by empowering the banks to compete with challengers by quickly adapting their business models and offering differentiated services to their customers.

With the combined strengths of DXC and Temenos, customers can adopt a modern core banking solution, confident in their migration success and benefits of adopting a fully hosted and resilient solution.

Dmitry Loschinin, Executive Vice President, DXC Luxoft Analytics & Engineering, said: "We are excited to strengthen our collaboration with Temenos, the market-leading, cloud-native banking-technology provider. Core IT systems play a central role in helping banks innovate and deliver next-generation banking services to their customers. And like any major IT transformation, this kind of enterprise-level change needs to be undertaken with the right partners. Together with Temenos, DXC's world-class professional services and deep banking expertise will empower banking customers to begin their IT modernization journey, focusing on their core business strengths while we address the implementation."

Max Chuard, Chief Executive Officer, Temenos, said: "We are delighted to announce this strategic agreement with DXC, a proven partner for delivering strategic transformation of complex, mission-critical IT systems for financial services firms. This joint go-to-market strategy with DXC is a new channel to market for Temenos, which will accelerate our penetration in the large banks segment, notably in the U.S. market, which represents approximately 60% of the total third-party market spend. Together, we seek to help larger banks with complex, legacy IT architectures break down silos, reduce IT complexity and costs, and gain greater speed to market. We can help DXC's customers accelerate their business transformation and provide outstanding customer experiences."


About DXC Technology

DXC Technology (NYSE: DXC) helps global companies run their mission-critical systems and operations while modernizing IT, optimizing data architectures, and ensuring security and scalability across public, private and hybrid clouds. With decades of driving innovation, the world's largest companies trust DXC to deploy the Enterprise Technology Stack to deliver new levels of performance, competitiveness and customer experiences. Learn more about the DXC story and our focus on people, customers and operational execution at http://www.dxc.technology.

About Temenos

Temenos AG (SIX: TEMN) is the world's leader in banking software. Over 3,000 banks across the globe, including 41 of the top 50 banks, rely on Temenos to process both the daily transactions and client interactions of more than 1.2 billion banking customers. Temenos offers cloud-native, cloud-agnostic and AI-driven front office, core banking, payments and fund administration software enabling banks to deliver frictionless, omnichannel customer experiences and gain operational excellence.

Temenos software is proven to enable its top-performing clients to achieve cost-income ratios of 26.8%, half the industry average, and returns on equity of 29%, three times the industry average. These clients also invest 51% of their IT budget on growth and innovation versus maintenance, which is double the industry average, proving that the banks' IT investment is adding tangible value to their business.

For more information, please visit http://www.temenos.com.

Read the rest here:
DXC Technology Signs Agreement with Temenos, Enabling its Large Bank Customers to Reimagine Core Banking Transformation - Business Wire


South Africa: our heads are finally in the cloud – IT-Online

Most strategic discussions in boardrooms around the world will include cloud-based offerings, because the services offered have become fundamental to a modern, resilient enterprise.

Trent Odgers, cloud hosting manager at Veeam

Last year many organisations may well have been taking a wait-and-see approach, but the pandemic has forced everyone's hand and sped up digital transformation in all its guises.

Cloud computing is believed to have been invented by Joseph Carl Robnett Licklider in the 1960s with the idea of being able to connect people and data from anywhere at any time. Fast-forward 60 years and the various lockdowns around the world have made this idea an absolute and immediate necessity.

There have been winners and, sadly, there were losers during the pandemic. Some businesses thrived and others paid the ultimate price. The key criterion for many businesses' success was their ability to pivot and adapt to new ways of working.

In many ways, we have entered the age of the pivot: large, complex organisations have to make rapid, if not radical, changes in order to survive and adapt to changing needs. While this has always been important to prevent being disrupted, today it is an existential consideration.

The cloud enables a business to pivot faster with less stress. You no longer need to be the guru, or hire the most expensive IT staff.

Rather, choose a reputable partner and give them your requirements. This is particularly true across many industries, from financial services, automotive, healthcare and manufacturing to small and medium-sized businesses, where the cost of managing their own data and security via costly data centres is high, so offloading this and adopting cloud services becomes more effective for their business.

Without a doubt, this is one of the main reasons there is such a positive attitude in South Africa toward cloud-based services. Some of the biggest attractions of these services are their ease of use and consumption-based model.

While many South African enterprises have a cloud-first strategy, the most accurate description of the local market is that it is a hybrid-cloud environment. At its most basic, this means that there is a strategic mix of services spread across on-premises, private cloud, managed cloud via local cloud providers and public cloud providers.

All of these services that are being consumed are generating data and that data needs to be backed up, recoverable, secure and compliant. If the service is disrupted, it needs to be restored rapidly, because downtime or breaches carry a large financial and reputational risk for an organisation.

Many words have been written about shifting workforces off-site to comply with government restrictions. Beyond the cybersecurity threats, remote working has meant there is more data at the edge than in January, simply because many people had to take their devices home, or use their personal devices for work.

It is obvious, then, to see why it is non-negotiable for organisations to find partners that are able to manage this data in an integrated and compliant manner.

South Africa is going to feel the full weight of compliance in 2021 when the Protection of Personal Information (POPI) Act comes into force. No longer a concept, compliance now has teeth. The complexities of managing so much personal information are huge, meaning compliance can no longer be an add-on, it has to be front and centre of digital transformation strategies.

SA is on par with the best

Cloud services offered by both public and private organisations in South Africa are world class. It's not a uniquely South African trait, but we tend to be our own biggest critics. We see first-hand that South African offerings in our partner network are delivering managed services across Africa, Europe and the US with great success.

Companies such as AWS and Microsoft have already made significant investments in Africa that improve the standard across all elements of cloud-based services; they demand a very high standard, which has a positive impact on the entire industry and country. We will continue to see this trend gain momentum into 2021 as these services grow exponentially.

As connectivity and electricity stability improve, the potential for the cloud to change the country and continent is limited only by imagination. With a high-end smartphone, relevant sensors and a fast, stable internet connection, telemedicine could change the face of our continent's health systems forever.

Education could also be drastically transformed. Harnessing the power of the cloud could bring world-class education to entire communities no matter where they are and change their current and future prospects.

Of course, this country and continent has unique challenges, but as improving connectivity unlocks the full power of the cloud, we could be unlocking new opportunities and entire new industries.

What is abundantly clear, though, is that Africa's time in the cloud is well and truly here. In South Africa, this is being embraced, which bodes well for our global competitiveness and prospects.

The industry on these shores is maturing, and with proper Cloud Data Management, organisations can ensure their data is secure and compliant, with little to no downtime. The world will continue changing at breakneck speed thanks to the vision of Licklider all those years ago; organisations need to change with it.


Read the original post:
South Africa: our heads are finally in the cloud - IT-Online


NASSCOM Feedback on the Draft National Blockchain Strategy – Web Hosting | Cloud Computing | Datacenter | Domain News – Daily Host News

[This blog has been co-authored with Indrajeet Sircar]

The Ministry of Electronics and Information Technology (MeitY) has put together a very comprehensive and forward-looking draft of the National Blockchain Strategy (Draft Strategy). The industry response around the document has been highly positive, and the industry is keen to collaborate with the Government in seizing the opportunity presented by distributed ledger technologies (DLT).

The industry has already started piloting various use-cases of DLT, with several industry members viewing DLT as a technology that is likely to have wide-ranging mainstream applications within an 18-month horizon. Accordingly, to gauge the industry's feedback, we reached out to several domestic and international players, particularly in the Banking and Financial Services Industry (BFSI) sector, where early adoption of blockchain has been prominent. The feedback received from the industry is a collation of early-stage learnings that members have gathered from their initial pilot efforts, and the recommendations in the present submission are informed by these learnings. We also received some very helpful feedback from the industry in terms of the practical issues that might be faced in the eventual implementation of the framework, and on the potential use-cases that could be prioritised.

Highlights of the Submission:

I. Shortlisting Appropriate Use-Cases

At present, the Draft Strategy lays out several use-cases of DLT that could be prioritised. However, not all such use-cases would demonstrate optimal usage of the technology. Use-cases such as management of birth and death certificates (listed at Page 8 of the Draft Strategy) would not be as optimal a use-case for DLT, since the information would not be altered through the course of multiple transactions. By contrast, a more appropriate use-case could be land record management, where multiple transactions occur, giving rise to the need to reconcile multiple dynamic data attributes such as title, nature of rights provided (lease/freehold, etc.) and geographical boundaries of land parcels, thereby providing an appropriate test-case for DLT. In this context, the MeitY could consider including a use-case validation framework to enable the identification of solutions which should be implemented over DLT or the Unified Blockchain Framework.
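To illustrate why multi-transaction records suit DLT, the toy Python sketch below chains land-record transfers by hash, so no prior entry can be silently rewritten. It is a conceptual illustration only, not any specific DLT platform, and all field names are invented.

```python
# Toy illustration of why land records suit an append-only ledger:
# each transfer references the hash of the previous record, so the
# title history cannot be silently rewritten. Fields are invented;
# this is not any specific DLT platform.
import hashlib
import json

def record_hash(record: dict) -> str:
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

ledger = []

def append_transfer(parcel_id: str, new_owner: str, rights: str) -> None:
    prev = record_hash(ledger[-1]) if ledger else "genesis"
    ledger.append({
        "parcel_id": parcel_id,
        "owner": new_owner,
        "rights": rights,        # e.g. lease or freehold
        "prev_hash": prev,       # links the record to the full prior history
    })

append_transfer("MH-042", "A. Kumar", "freehold")
append_transfer("MH-042", "B. Singh", "leasehold")

# Tampering with an earlier record breaks every later prev_hash link.
assert ledger[1]["prev_hash"] == record_hash(ledger[0])
```

A static record such as a birth certificate never accumulates this kind of linked history, which is why it gains little from the same machinery.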

II. Role of the Government

At present, the role to be played by the Government towards achieving the stated objectives of the strategy is unclear. Therefore, one of the primary and consistent feedback points received from the industry is the need for additional clarity over the role of the Government.

Government's Role as Ecosystem Developer: In its ecosystem development role, the Government should consider closer coordination with various State Governments on their blockchain-related efforts, and proactive engagement with top global and Indian companies and academic institutions working on blockchain technologies, with a view to helping develop the blockchain ecosystem in India.

Government's Role as Regulator: In its role as regulator, the Government should work closely with international counterparts and technology leaders to formulate Common and Open Global Standards, together with the sector-specific standards required for the ecosystem in India.

III. Addressing Tokenisation

The scope of the Draft Strategy is largely limited to DLT and does not deal with tokenisation. However, the industry believes that tokenisation could play a significant role in incentivising ecosystem participation. Tokenisation can enable easier ways for organisations to host nodes and DAPPs.

The feedback received from the BFSI industry segment highlights several potential applications of tokenisation, such as:

The Draft Strategy could consider exploring the potential benefits and risks associated with tokenisation. In particular, the Draft Strategy should explore the potential use-cases for exchange and utility tokens (i.e. not currency or asset-referenced tokens) that are unlikely to give rise to systemic risks.

IV. Data Privacy and Blockchain

The Draft Strategy should recognise the importance of sector-specific adoption strategies for blockchain and DLTs when it comes to data privacy. Appropriate data storage strategies need to be adopted by the developers, operators and participants of blockchain consortiums in order to ensure compliance with applicable privacy and data protection regulations. The Government and the Draft Strategy could consider providing guidance, under the Unified Blockchain Framework, as to what Personal Data should or should not be stored on the chain. Moreover, the final strategy of the MeitY should be informed by and harmonised with the data protection requirements that are soon to be finalised under the ambit of the Personal Data Protection Bill, 2019.
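One privacy pattern consistent with this recommendation, sketched below under stated assumptions rather than drawn from the Draft Strategy, keeps personal data in erasable off-chain storage and anchors only a salted hash on-chain; the field names and stores are invented for illustration.

```python
# Hedged sketch of one common privacy pattern: personal data stays in
# off-chain storage, and only a salted commitment (hash) goes on-chain.
# Field names and the in-memory stores are invented for illustration.
import hashlib
import os

off_chain_store = {}   # stands in for a conventional, erasable database
on_chain_records = []  # stands in for the ledger

def register(person_id: str, personal_data: str) -> None:
    salt = os.urandom(16)
    commitment = hashlib.sha256(salt + personal_data.encode()).hexdigest()
    # Personal data (and the salt needed to verify it) is kept off-chain,
    # so it can be corrected or erased to satisfy data-protection law.
    off_chain_store[person_id] = {"data": personal_data, "salt": salt}
    # Only the opaque commitment is written to the immutable ledger.
    on_chain_records.append({"id": person_id, "commitment": commitment})

def verify(person_id: str) -> bool:
    stored = off_chain_store[person_id]
    expected = hashlib.sha256(stored["salt"] + stored["data"].encode()).hexdigest()
    return any(r["id"] == person_id and r["commitment"] == expected
               for r in on_chain_records)

register("p-001", "Jane Doe, 1984-02-29")
assert verify("p-001")
```

Because the ledger holds only an opaque commitment, erasing the off-chain record leaves nothing personally identifiable behind, which is the crux of reconciling immutability with erasure rights.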

V. Appropriate Legislative Changes for Enabling Wider Adoption

The Draft Strategy should also identify any existing legislative and regulatory bottlenecks that could inhibit the adoption of blockchain and DLT for critical use cases. The MeitY should work to identify such bottlenecks (if any) and suggest the appropriate legislative changes required. There are already several helpful proposals in the Draft Strategy relating to regulatory sandboxes for DLT, which could serve as a useful starting point for gathering learnings to inform legislative amendments.

VI. Platform for Lesson Sharing

The Draft Strategy should provide a platform for lesson sharing between the industry, Government and the larger DLT and blockchain development ecosystem. In particular, the recommendations of the Draft Strategy should (wherever possible) be linked to lessons and experiences drawn from existing pilots. This could be implemented in the form of a forum for regulators and Government departments to share and exchange views and best practices for enabling and evaluating blockchain developments. Experts and practitioners should be invited to such a forum to share the latest developments. These forums could be sector-focussed.

VII. Role of Universities in Skilling and R&D

Lastly, we appreciate the Draft Strategy's emphasis on ecosystem development and capacity building. However, there is scope to further clarify the role of other ecosystem participants, particularly universities and research organisations. The Draft Strategy could consider identifying a few areas for the Government to invest strategically in R&D surrounding DLT/blockchain.

[A copy of the detailed submission is attached with this blog]

In case of any further clarifications, please write to komal@nasscom.in or indrajeet@nasscom.in.

The post NASSCOM Feedback on the Draft National Blockchain Strategy appeared first on NASSCOM Community | The Official Community of Indian IT Industry.

Go here to read the rest:
NASSCOM Feedback on the Draft National Blockchain Strategy - Web Hosting | Cloud Computing | Datacenter | Domain News - Daily Host News


The cloud without the wait: mobile edge computing and 5G – Verizon Communications

It all starts with the cloud

The cloud stores your data, all your pictures and your phone contacts, and it processes information that helps make your favorite apps work. Cloud computing can do several things at once, really well: It can compute, store data and work with the network, all in one location. Many cloud providers, for example, have storage facilities that do cloud computing in locations all over the world. When you take a photo with your phone and send it to Instagram, it goes to a cloud facility, possibly several hops and four or five states away, where all the necessary computing takes place, and then it publishes to Instagram. It's a similar process for reading your morning email or listening to a podcast. For things like that, the centralized cloud works really well, and the latency is low enough that your experience is just fine.

But certain experiences require a lot of data to move very quickly to and from a device and the cloud. That's where MEC comes in. It brings the cloud closer to you.

The edge refers to the part of Verizon's network that is closest to you: Your device connects to the network at the edge. And edge computing means bringing the cloud to the edge of the network closest to your device.

So how do you make edge computing more mobile, and closer to the devices that need it?

MEC is an entire network architecture that brings computing power close to any device that's using it. Instead of data going back and forth to cloud servers four or five states away, it's processed just miles or meters from the device. For this purpose, Verizon has installed cloud servers in its own access points across its networks.
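The latency gap that motivates MEC is easy to measure for yourself. The hedged Python sketch below times TCP round trips to two endpoints; the hostnames are placeholders, not real Verizon services.

```python
# Sketch: comparing round-trip latency to a distant cloud region versus
# a nearby edge endpoint. Hostnames are placeholders, not real services.
import socket
import time

def rtt_ms(host: str, port: int = 443, samples: int = 5) -> float:
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass  # TCP handshake complete; a rough proxy for network RTT
        total += time.perf_counter() - start
    return 1000 * total / samples

for host in ["regional-cloud.example.com", "edge-node.example.com"]:
    print(f"{host}: {rtt_ms(host):.1f} ms average")
```

Shaving even tens of milliseconds per round trip is what makes edge-hosted workloads feel instantaneous compared with a data center several states away.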

Read more:
The cloud without the wait: mobile edge computing and 5G - Verizon Communications

Read More..

Red Hat supports high-availability apps in AWS and Azure – Blocks and Files

Red Hat Linux running in the AWS and Azure public clouds now supports high-availability and clustered applications with its Resilient Storage Add-On (RSAO) software. This means apps like SAS, TIBCO MQ, IBM WebSphere MQ, and Red Hat AMQ can all run on Red Hat Linux in AWS and Azure for the first time.

Announcing the update in a company blog post, Red Hat Enterprise Linux product manager Bob Handlin wrote: "This moment provides new opportunities to safely run clustered applications on cloud servers that, until recently, would have needed to run in your data centre. This is a big change."

AWS and Azure did not support shared block storage devices in their clouds until recently. One and only one virtual machine instance, such as EC2 in AWS, could access an Elastic Block Storage (EBS) device at a time. That meant high-availability applications, which guard against server (node) failure by failing over to a second node which can access the same storage device, were not supported.

Typically, enterprise high-availability applications such as IBM WebSphere MQ have servers accessing a SAN to provide the shared storage. These applications could not be moved to the public cloud without having shared block storage there.

Azure announced shared block storage with an Azure shared disks feature in July 2020. And AWS announced support for clustered applications using shared (multi-attach) EBS volumes in January this year. The company said customers could now lift-and-shift their existing on-premises SAN architecture to AWS and Azure without refactoring cluster-aware file systems such as RSAO or Oracle Cluster File System (OCFS2).
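For the AWS case, multi-attach is a property chosen when the volume is created. A minimal boto3 sketch follows; the Availability Zone, size and instance IDs are placeholders, and multi-attach requires a Provisioned IOPS (io1/io2) volume.

```python
# Minimal sketch: creating a multi-attach EBS volume and attaching it to
# two instances. AZ, size, and instance IDs are placeholders. Multi-attach
# requires a Provisioned IOPS (io1/io2) volume, and the instances must sit
# in the same Availability Zone as the volume.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,                 # GiB
    VolumeType="io2",
    Iops=3000,
    MultiAttachEnabled=True,  # lets several instances attach the same volume
)

for instance_id in ["i-aaaa1111", "i-bbbb2222"]:  # placeholder instance IDs
    ec2.attach_volume(
        VolumeId=volume["VolumeId"],
        InstanceId=instance_id,
        Device="/dev/sdf",
    )
```

Multi-attach only provides the shared block device; arbitrating who writes what, and when, is the job of a cluster-aware file system such as GFS2, described next.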

Red Hat's Resilient Storage Add-On lets virtual machines access the same storage device from each server in a group through Global File System 2 (GFS2). This has no single point of failure and supports a shared namespace and full cluster coherency, which enables concurrent access, and cluster-wide locking to arbitrate storage access. RSAO also features a POSIX-compliant file system across 16 nodes, and Clustered Samba or Common Internet File System for Windows environments.
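Because GFS2 arbitrates POSIX locks cluster-wide, ordinary file-locking code coordinates writers on different nodes without modification. Here is a hedged sketch assuming /mnt/gfs2 is a shared GFS2 mount; the path and log file are placeholders.

```python
# Hedged sketch: GFS2 extends ordinary POSIX file locks across the cluster,
# so the same fcntl code coordinates writers on different nodes. Assumes
# /mnt/gfs2 is a shared GFS2 mount; the path is a placeholder.
import fcntl

with open("/mnt/gfs2/shared-state.log", "a") as f:
    fcntl.lockf(f, fcntl.LOCK_EX)   # blocks until no other node holds the lock
    try:
        f.write("node A appended safely\n")
        f.flush()
    finally:
        fcntl.lockf(f, fcntl.LOCK_UN)
```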

AWS's and Azure's shared block storage developments have enabled Red Hat to port RSAO software to their environments. RSAO uses the GFS2 clustered filesystem, and it passes Fibre Channel LUN or iSCSI SAN data IO requests to either an AWS shared EBS volume or an Azure shared disk, as appropriate.

Handlin said Red Hat will test RSAO on Alibaba Cloud and likely other cloud offerings as they announce shared block devices.

Go here to read the rest:
Red Hat supports high-availability apps in AWS and Azure – Blocks and Files
