
Benefits of Edge Computing in the Internet of Things – BBN Times

As organizations continue to push for the integration of technologies and information sources to streamline their functioning and coordination, the need for decentralization is also gaining importance as a way to enhance the responsiveness of these systems.

The past few years have seen IoT rise from being just another futuristic buzzword to a tool with great utility and business value in the present. The internet is rife with real-life examples of IoT applications, and from what we've seen and what we know about the technology, such cases are just scratching the surface of potential IoT applications. But just like any other form of technology, IoT comes with its own set of obstacles, or at least areas that can be further enhanced. For starters, as IoT networks spread across and cover wide areas by incorporating a growing multitude of devices, the sheer volume of the data collected will require heavily resource-intensive processing devices and high-capacity data centers. Such problems, which arise from the centralized, integrated nature of the technology, can be eliminated by distributing the control and processing power of the IoT network toward the edge: the points where data is actually gathered, and where action, or rather reaction, is generally required.

Edge computing refers to the installation and use of computational and storage capabilities closer to the edge, i.e., the endpoints where the data is gathered or where an immediate response is required. IoT systems can comprise a large number and many types of endpoints connected to centralized, often remotely located data centers. These endpoints include, but are not limited to, sensors, actuators, and other data-gathering devices.

Edge computing in IoT means having autonomous systems of devices at these endpoints (or the edge) that simultaneously gather information and respond to it without having to communicate with a remotely located data center. Instead of relying on remote data centers and computational servers, data can be processed right where it is collected, eliminating the need for constant connectivity to centralized control systems and the problems inherently associated with such setups.

For instance, a software company that sells cloud-based mobile applications can have cloud servers based in multiple locations closer to users instead of in a single location that may lead to undesirable latency and a single point of failure in case of any mishap. If centralized servers failed for some reason, all application users would lose their data and access to services at once. Additionally, the servers would have to deal with heavy traffic, causing latency and inefficiency. By contrast, a decentralized system would ensure that the data pertinent to specific users is hosted in whichever of the multiple data centers is closest to them, minimizing latency and limiting the impact of any potential failure. In addition to solving inherent IoT problems, the incorporation of edge computing into IoT is increasingly being seen as a necessity, as it enhances the network in terms of functionality as well as performance.
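To make the routing idea concrete, here is a minimal Python sketch of how a client might be steered to the closest of several data centers; the region names and latency figures are invented for illustration.

```python
# Minimal sketch of edge-style request routing: send each client to the
# lowest-latency data center instead of one central server. The region
# names and latency figures below are made up for illustration.
measured_latency_ms = {
    "eu-central": 18,
    "us-east": 95,
    "ap-southeast": 210,
}

def nearest_region(latencies: dict) -> str:
    """Pick the region with the smallest measured round-trip time."""
    return min(latencies, key=latencies.get)

print(nearest_region(measured_latency_ms))  # -> "eu-central"
```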

Organizations using edge computing to power their IoT systems can minimize network latency, i.e., the response time between client and server devices. Since the data centers are closer to the endpoints, there is no need for data to travel to and from distant centralized systems. And as the edge storage and control systems only handle the data from the few endpoints they are linked to, bandwidth issues seldom slow down the flow of data. Since IoT systems require high-speed information transfer to function with maximum efficacy, edge computing can significantly boost organizational performance.

Another benefit of decentralizing IoT with edge computing is improved data security. A centralized data repository is prone to attacks that aim to destroy, steal, or leak sensitive data, and such attacks can lead to the wholesale loss of valuable data. Conversely, distributing critical data across the IoT network and storing it on edge devices can limit the loss of data. Additionally, it can help with compliance with data privacy rules such as the GDPR, since data is only stored on the devices or subsystems that actually use it. For instance, a multinational corporation can use edge devices to store customer data locally, close to where the customers are, instead of in an overseas repository. The data needn't be stored in locations where irrelevant personnel can have access to it.

Cloud costs are also minimized, as most data resides on edge devices instead of in centralized cloud servers. Additionally, the cost of maintaining high-capacity, long-distance networks falls as bandwidth requirements diminish.

It is easy to see now why any discussion on IoT should always include the exploration of edge computing as a key enabler. Edge computing, more than a technology, is a design framework of sorts that would redefine the way IoT systems are built and the way they function. Although the combination of other solutions will also be needed to expedite the widespread adoption of IoT, edge computing might just prove to be the chief catalyst in the process.

The existing applications of IoT are already providing us with the evidence for a densely interconnected future, where every device will be able to communicate with every other device, creating an intricate web of information in and around our daily lives. These devices will be able to incessantly gather information through a myriad of sensors, process information through complex algorithms running on centralized servers, and effect changes using actuating endpoints. From agriculture to manufacturing and healthcare to entertainment, every industry is set to see massive transformation driven by IoT.

Although the ability of IoT systems to execute and initiate responsive action will be transformational enough, the real revolution, as it were, would be brought about by the essentially limitless cornucopia of data that will be generated due to the unbridled proliferation of sensors and other data-gathering IoT endpoints. In fact, this IoT data will prove to be the real wealth for the businesses using the technology, as structured data in unprecedented quantities can be captured and analyzed to gain deeper insights into the market and also into organizations and business processes. The increased volume of data gathered will enable businesses to take even more effective action, driving operational excellence. However, gathering and processing such vast amounts of data would require high-capacity storage, communication, and computational infrastructure. Even though advances in communications technology such as the mainstream adoption of 5G can catalyze IoT innovation and implementation, newer ways of making IoT more effective and efficient are still required. And one of the most promising solutions for enabling IoT to realize its potential is edge computing.

Visit link:
Benefits of Edge Computing in the Internet of Things - BBN Times


Managed Infrastructure Services Market 2021-2026 Top Trends, Business Opportunity, and Growth Strategy - Amcor Ltd, Resilux NV - The Manomet Current – The…

Managed Infrastructure Services Market with COVID-19 Impact by Component, Application, Services, and Region- Forecast to 2026

The Global Managed Infrastructure Services Market Research Report 2021-2026 is a significant source of insight for business specialists. It provides a business overview with growth analysis, historical and future cost analysis, and revenue, demand, and supply data. The research analysts give a detailed description of the market value chain and a distributor analysis for the Managed Infrastructure Services market. The study provides comprehensive information that improves the understanding, scope, and application of this report. This is the latest report, covering the impact of COVID-19 on the Managed Infrastructure Services market.

The global managed infrastructure services market was valued at USD 80.45 billion in 2020, and it is expected to reach USD 143.23 billion by 2026, registering a CAGR of 9.95%, during the period of 2021-2026.

(Exclusive Offer: Flat 30% discount on this report)

Get a free Sample Copy of this Report-

https://www.marketinsightsreports.com/reports/02082591570/managed-infrastructure-services-market-growth-trends-covid-19-impact-and-forecasts-2021-2026/inquiry?Mode=Ak*kshata

Top Leading Companies of the Emergency Location Transmitter Market are Ubisense Group PLC, AeroScout, Inc., TeleTracking Technologies, Inc., Savi Technology, Zebra Technologies, CenTrak Healthcare company, Ekahau, Inc., Midmark Corporation, Identec Group AG, Sonoitor Technologies, Inc., Awarepoint Corporation (Centrak Inc.), Kontact.io, Alien Technology, Stanley Healthcare, Impinj and others.

Industry News And Updates-

Sep 2019 - South Slope, a rural independent telecommunications cooperative, deployed the Cisco NCS5500 series platform to meet its growing capacity demands, using its existing network to deliver data, voice, and video services, along with a variety of business Ethernet and cellular backhaul services to its customers throughout eastern Iowa. To facilitate the transition, increase the power of the underlying transport network, and reduce operational complexity, Cisco and South Slope created a converged solution based on the Cisco NCS5500 router series. With this new solution, South Slope was able to utilize integrated multiplexing optics in the routing platform to exceed 200G wavelengths over the existing packet core.

January 2019 - IBM Services signed a USD 540 million multi-year managed services agreement with Nordea Bank, a financial services company based in Sweden, under which Nordea will outsource its IBM Z operations to the company. The agreement reportedly covers a majority of IBM Z infrastructure services in the five countries where Nordea operates. The deal also allows Nordea continued access to IBM's latest technology advancements, including cognitive services, while maintaining a sustainable IBM Z organization.

Key Market Trends-

The advent of cloud deployment has brought changes to the managed infrastructure services provider (MISP) space and made providers embrace a model for delivering technology services over public or private clouds. Considering the advantages the cloud offers, businesses are seeking MISPs that have partnerships with cloud providers (such as Google, AWS, and Microsoft) to choose the right cloud providers, migrate to the cloud, and manage cloud services after the transition.

With increasing demand from enterprises, various companies have launched managed cloud infrastructure services; HC (Host Color), for example, did so in 2019. The managed services are available with public cloud servers, hybrid cloud, and hosted private cloud, where managed cloud infrastructure includes installation of a Linux or Windows-based operating system and regular maintenance and updates of software programs and applications.

Recent technology trends, such as enhanced cloud infrastructure and IoT-enabled ecosystems, have created new business imperatives across the US IT sector, and the penetration of the public cloud in the United States was predicted to be higher in 2020. Additionally, Fujitsu has been recognized by Amazon Web Services (AWS) as an official AWS managed infrastructure provider partner, validating the company's capabilities in accelerating cloud transformation and helping enterprises and government fast-track digital transformation and innovation. Such instances are expected to fuel market demand across the United States during the forecast period.

Inquire for Discount:

https://www.marketinsightsreports.com/reports/02082591570/managed-infrastructure-services-market-growth-trends-covid-19-impact-and-forecasts-2021-2026/discount?Mode=Ak*kshata

Geographic Coverage-

The report contains detailed country-level analysis, market revenue, market value, and forecast analysis of consumption, revenue, and Managed Infrastructure Services market share; the growth rate and the historical and forecast (2016-2026) figures for the covered regions are included.

Browse Full Report at: (Avail a Free Consulting For your Business)

https://www.marketinsightsreports.com/reports/02082591570/managed-infrastructure-services-market-growth-trends-covid-19-impact-and-forecasts-2021-2026?Mode=Ak*kshata

Reasons for purchasing this Report-

Finally, the Managed Infrastructure Services Market report is a credible source of statistical market data that can help grow your business. The report also presents a SWOT analysis, an investment feasibility analysis, and an investment return analysis.

We also offer customization of reports based on customer requirements:

1- Free Country-level analysis for any 5 countries of your choice.

2- Free Competitive analysis of any 5 key market players.

3- Free 40 analyst hours to cover any other data points.

Please connect with our sales team (sales@marketinsightsreports.com).

All the reports that we list have been tracking the impact of COVID-19 on the market. Both the upstream and downstream of the entire supply chain have been accounted for in doing this. Also, where possible, we will provide an additional COVID-19 update supplement/report in Q3; please check with the sales team.

About Us:

MarketInsightsReports provides syndicated market research on industry verticals including Healthcare, Information and Communication Technology (ICT), Technology and Media, Chemicals, Materials, Energy, and Heavy Industry. MarketInsightsReports provides global and regional market intelligence coverage, a 360-degree market view which includes statistical forecasts, competitive landscape, detailed segmentation, key trends, and strategic recommendations.

Contact Us:

Irfan Tamboli (Head of Sales) Market Insights Reports

Phone: +1 704 266 3234 | +91-750-707-8687

sales@marketinsightsreports.com | irfan@marketinsightsreports.com

Read the original:
Managed Infrastructure Services Market 2021-2026 Top Trends, Business Opportunity, and Growth Strategy - Amcor Ltd, Resilux NV - The Manomet Current - The...


Massive Demand for High-End Enterprise Servers Market is increasing with the growth in cloud computing solutions and services Industry FLA News – FLA…

A server is a hardware system that operates on suitable software and provides network services by operating across the computer network. A server can run on an individual computer called the server computer or on a network containing numerous interconnected computers.

The increase in demand for x86 high-end servers in recent years is one of the major driving factors for the global high-end enterprise servers market. Demand for high-end enterprise servers is growing along with the cloud computing solutions and services market. Enterprises providing cloud computing services are adopting high-end enterprise servers to meet the speed and uptime demands of their services. The emerging big data trend is also driving the adoption of high-end enterprise servers across major industry verticals. Big data analytics is being used by all major companies across industry verticals, and it requires servers with high processing capabilities. Furthermore, the use of analytics and big data processing software is increasing across industry verticals such as hospitals, retail, and banking, financial services and insurance (BFSI), which is driving the growth of the high-end enterprise servers market. However, the high initial and installation costs associated with high-end enterprise servers are one of the major challenges restraining their wide adoption. Furthermore, a high level of technical skill is required for the installation and maintenance of high-end enterprise servers.

Request for a sample:

https://www.transparencymarketresearch.com/sample/sample.php?flag=S&rep_id=4060

The global high-end enterprise servers market is segmented on the basis of operating system, chip type, operating system bits, and geography. On the basis of operating system, the market is segmented into Linux, Windows Server, IBM i, and UNIX operating systems. Further, on the basis of chip type, the market is segmented into two major types, namely complex instruction set computing (CISC) and reduced instruction set computing (RISC). On the basis of operating system bits, 32-bit and 64-bit enterprise servers are the two major types. High-end server solutions are used by various industry verticals. Thus, on the basis of industry vertical, the market is segmented into the banking, financial services, and insurance (BFSI) sector, telecom and IT, media and entertainment, retail, manufacturing, healthcare, and other industry verticals. North America is the major geographical segment for the high-end enterprise servers market in terms of revenue, followed by Europe. The United States, Japan, France, and Germany are some of the major countries driving the growth of the market in the North American and European regions.

Ask for brochure:

https://www.transparencymarketresearch.com/sample/sample.php?flag=B&rep_id=4060

Apple, Inc., Aspera, Inc., CCS Infotech Limited, Cisco Systems, Inc., Dell, Inc., Appro International, Inc., Fujitsu Computer Systems Corporation, ASUSTeK Computer, Inc., Fujitsu Siemens Computers, Acer, Inc., Borland Software Corporation, Unisys Corporation, Groupe Bull, HCL Infosystems Ltd., Hewlett-Packard Company, Hitachi, Ltd., IBM Corporation, Lenovo Group Limited, NCR Corporation, NEC Corporation, Silicon Graphics, Inc., Sun Microsystems, Inc., Toshiba Corporation, Super Micro Computer, Inc., Uniwide Technologies, Inc., and Wipro Infotech are some of the major vendors in the global high-end enterprises servers market.

Read Our Latest Press Release:

https://www.prnewswire.com/news-releases/beyond-customary-energy-and-cost-saving-advantages-both-hard-and-soft-benefits-extend-immense-growth-opportunities-in-building-analytics-market-growth-trajectory-projected-at-stellar-15-cagr-from-20212031tmr-301288910.html

Go here to read the rest:
Massive Demand for High-End Enterprise Servers Market is increasing with the growth in cloud computing solutions and services Industry FLA News - FLA...


Cloud storage that encrypts your files so they’re safe Explica .co – Explica

There was a time when free services reigned on the internet. Strictly speaking, that is still the case. But some wonder to what extent it is worth using free services in exchange for their data (our data) being used for advertising campaigns or to sell us something. Cloud storage is a sector that offers free and paid space for saving files. And it is increasingly common that, among their features, these services highlight encryption of your files to keep them safe.

In other words, it is no longer enough to have space to save your files, share them, and have them on all your connected devices. Now we also need the files we upload to the cloud to be safe. And for this, in addition to encrypting the connections between your devices and the servers of the cloud storage service, it is increasingly common to find services that encrypt your files so that they are inaccessible to third parties, or even to those who store them online.

Thanks to file encryption, cloud storage sheds the old mistrust that documents on an external server are visible to other people. That is no longer the case. The content you upload to the cloud stays safe both in transit and at rest, wherever you are.

From the creators of NordVPN, a popular VPN service, comes NordLocker, a cloud storage service that bets on end-to-end encryption. Its free version gives us 3 GB of space, which you can expand to 500 GB for $3.99.

With its own applications for Windows and macOS, which integrate with Finder and Windows Explorer, this service makes it easy to encrypt files, share them through public links, make automatic backups, and so on. As a distinctive feature, encryption is applied to your files while they are still on your device. From there, you can upload the content of your choice to its servers.

As for the cryptography, it uses AES-256 encryption along with Argon2 and ECC. And with the local encryption feature, your files are kept safe even before you upload them to the cloud.
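The pattern the article describes, encrypting on the device so the provider only ever stores ciphertext, can be sketched in a few lines of Python. This is a generic illustration using PBKDF2 and Fernet from the `cryptography` package, not NordLocker's actual AES-256/Argon2/ECC implementation:

```python
# General pattern behind "encrypt before upload": derive a key from a
# passphrase, encrypt locally, and only ever send ciphertext to the cloud.
# PBKDF2 + Fernet stand in here for NordLocker's real stack.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

salt = os.urandom(16)                      # stored alongside the ciphertext
key = derive_key(b"correct horse battery", salt)
ciphertext = Fernet(key).encrypt(b"contents of a private document")
# `ciphertext` is what gets uploaded; the provider never sees the key.
plaintext = Fernet(key).decrypt(ciphertext)
```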

Secure cloud storage. Based in Canada, Sync gives us 5 GB with its free account, space that can be increased by subscription. Whichever option is chosen, your files are stored using end-to-end encryption so they stay safe and private.

The cheapest paid account, $8 per month billed annually, increases your online space to 2 TB. Not bad for making backups and saving countless photos, videos, and all kinds of documents. And there are no limits on sharing and moving files from your devices to the cloud.

As security measures, in addition to end-to-end encryption, it protects us from third-party tracking and complies with security and privacy regulations such as HIPAA, GDPR, and PIPEDA. To this must be added two-factor authentication, restricted downloads on demand, password protection, and built-in file recovery covering a minimum of 180 days in its cheapest version.

Save your files in complete privacy. This is how Internxt, a provider of cloud storage services, presents itself. Its alternative to Dropbox or Google Drive is called Internxt Drive. For free, it gives us 10 GB to upload files.

As explained on its website, files are encrypted during the upload process and distributed in small packages. Only you have the digital key to gather those pieces and re-access the original file.

Internxt is available for any device with an app or through the web: PC, Mac, iOS, and Android. And as an added incentive, it is a project based in Valencia, Spain.

From Switzerland comes Tresorit, a cloud storage service that makes security its flagship feature. Available for companies and individuals, the cheapest plan costs €10 per month (€8.33 if you pay annually). In return you get 500 GB of cloud storage in which content remains encrypted.

Another interesting detail is that you can upload individual files of up to 5 GB on the most economical plan. That limit goes up to 20 GB with the most ambitious professional plan. Otherwise, the service has the same advantages as Dropbox or Google Drive, namely access from any device, ease of use, and sharing files with a public link.

Tresorit has web access and mobile applications. Also, you can integrate it into Outlook or Gmail to avoid storage problems with your emails. And to give you peace of mind, it complies with security and data protection standards such as ISO, CCPA, and HIPAA, among others.

See the original post:
Cloud storage that encrypts your files so they're safe Explica .co - Explica


The business benefits of using an open-source cloud – TechCentral

Even before the pandemic led to increased remote work migration, many organisations were becoming increasingly reliant on cloud solutions to streamline systems and workflow. But as with any enterprise technology, implementing cloud solutions comes with questions about the best way for individual businesses to harness their benefits.

By now, we're realising that using a single cloud vendor can lead to limitations and that a flexible, multi-vendor strategy is better for innovation. Although using a variety of cloud environments gives businesses the ability to adapt to changing business requirements, it also requires integration. Open source gives organisations an answer to this: it offers unmatched flexibility while also cutting the costs of software acquisition.

Open-source vs proprietary software conversations may lead one to believe that open source is the exception rather than the rule, but Linux, the open-source operating system that revolutionised data centre operations, enables almost all of the major public clouds being used today. It continues to power new cloud-native technologies.

Kubernetes, an open-source container orchestration platform, has also become the industry standard for managing cloud-native workloads. It automates the deployment, scaling and management of application containers, and allows you to move workloads effortlessly between on-premise, private or public cloud infrastructure.
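As a concrete taste of that declarative model, here is a minimal sketch using the official Kubernetes Python client; the deployment name, labels and nginx image are placeholders:

```python
# Minimal sketch of Kubernetes' declarative model: describe the desired
# state (two replicas of a container) and let the cluster converge to it.
# Names, labels and the image are placeholders for illustration.
from kubernetes import client, config

config.load_kube_config()  # uses your local kubeconfig

labels = {"app": "demo"}
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="demo"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels=labels),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels=labels),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="web",
                    image="nginx:1.25",
                    ports=[client.V1ContainerPort(container_port=80)],
                )
            ]),
        ),
    ),
)
client.AppsV1Api().create_namespaced_deployment(namespace="default",
                                                body=deployment)
```

The same manifest can be applied unchanged to an on-premises cluster or any public cloud's managed Kubernetes, which is the portability argument the article makes.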

It's clear that the open-source cloud shouldn't be treated as some kind of strange new tech. But what exactly are the business benefits of using an open-source cloud in a hybrid environment?

The public cloud is the best way for organisations to access IT resources that can easily be increased or decreased as needed, offering flexibility, scalability and cost savings (if used correctly). Internal private clouds, with their on-premises servers, give companies some of the benefits of the cloud with added security and without having to sacrifice control of their environment. To take advantage of the best of both the public and private cloud, many businesses implement hybrid cloud environments.

Historically, organisations have managed their public and private clouds separately, but an open-source hybrid cloud approach allows them to integrate these different environments into a single, comprehensive platform. This means on-premises services can have the same agility, functionalities and seamless experiences of a versatile public cloud.

The author, Danie Thom, says going the open-source route allows companies to gain independence and create custom solutions where they most need them

If businesses want the same flexibility from their on-premises data centres that they experience with the public cloud, they can no longer manage them in traditional and siloed ways. You must be prepared to adapt your technology, people and processes to gain any advantage. When computing demands fluctuate, businesses should be able to divert their workloads in a way that is both optimal for performance and usage costs. Open-source technology and methodologies enable this, and they mean that a business is less likely to be constrained by the functionalities of its cloud solution.

Compared to proprietary solutions that are rarely cross-compatible, open-source cloud infrastructure is also designed for interoperability, allowing different apps, servers or containers to work in harmony on different public cloud providers' platforms. You could even duplicate your infrastructure from one cloud to another without a significant amount of modification, meaning less time wasted and increased productivity.

With the convergence of virtualisation (running multiple virtual machines on a single server) and containerisation (running multiple applications on a single virtual machine) helping businesses run more workloads with fewer resources, open-source hybrid cloud approaches become even more beneficial. An open-source hybrid cloud platform allows you to containerise apps into their individual functions and develop and manage them all in one place, regardless of what platform they come from. There's no need to worry about the underlying tech: open source gives you a portable, stable and secure way of running your applications. Developers also become more productive and operations more efficient.

Proprietary cloud solutions create vendor lock-in, often limiting businesses to standardised solutions that can result in walled software gardens and dependency on one provider's suite of products or services. This limits a business's options when it needs added functionality or platform integration, or simply wants to change cloud providers. With an open-source cloud, you're never tied to particular functionalities or one particular platform, giving you the ability to choose whichever cloud services best suit your needs.

Businesses that use open-source cloud services can also integrate them into one cohesive ecosystem and customise them to their specific needs. This allows them to use a vast ecosystem of technology and services, creating a digital platform that fits their unique requirements, with secure, automated application runtimes. Rich integration, business process automation and automatic decisioning can be used to create immersive customer engagement anywhere.

We shouldn't be thinking about the cloud as the location of workloads and resources, but rather how we run them. The key to unlocking the power of the cloud is to treat it as an ecosystem, fostering interconnectedness, openness and standardisation across cloud architectures. An open-source cloud strategy is made for this, as it increases code quality, flexibility and the availability of features, improves visibility over every layer of infrastructure, and gives businesses the ability to move between platforms.

Because of its interoperability, going the open-source route also doesn't have to mean businesses need to move away from their existing proprietary cloud architecture. Rather, it allows them to gain independence and create custom solutions where they most need them. With collaboration and openness being the future of software and the future of the cloud, if businesses want to remain both integrated and innovative, an open-source cloud strategy is essential.

Simplify your company's digital transformation with Red Hat's checklist for a successful hybrid cloud strategy. Find out more.

Read the original:
The business benefits of using an open-source cloud - TechCentral


S. Korean tech companies to strengthen cooperation for AI server chips – The Korea Herald


The Ministry of Science and ICT said five data center companies -- Naver Cloud, Douzone Bizon, Kakao Enterprise, NHN and KT -- and the Artificial Intelligence Industry Cluster Agency signed a memorandum of understanding with local server chip companies SK Telecom, Rebellions, FuriosaAI, and the Electronics and Telecommunications Research Institute to expand the use of locally developed artificial intelligence (AI) semiconductors in data centers.

AI semiconductors have recently grown in demand from data center operators, which require the chips to efficiently process copious amounts of data.

The ICT ministry expects the global market for AI chips used in servers to reach $34.7 billion by 2030 from $3.5 billion last year.

Under the latest agreement, the companies will also cooperate in developing the chips, as well as pursue establishing a semiconductor testbed at an AI industrial complex in Gwangju, 329 kilometers south of Seoul.

The ministry said it will support the companies' move and that it hopes it will strengthen the country's semiconductor industry amid the recent global chip shortage.

SK Telecom, South Korea's leading wireless carrier, launched its AI chip for data center operations, the SAPEON X220, in November last year.

Separately, FuriosaAI, a startup that aims to launch its first semiconductor in the third quarter of this year, drew 80 billion won ($70.6 million) in additional funding Tuesday from investors, including Naver. (Yonhap)

Read the original here:
S. Korean tech companies to strengthen cooperation for AI server chips - The Korea Herald


An introduction to AWS IAM best practices – TechTarget

IT teams need to ensure that only known and trusted users can access their organization's vital applications and data.

Cloud users rely on services, like AWS Identity and Access Management (IAM), to secure and manage access across the vast portfolio of AWS services and resources -- and even federate a level of access control between AWS and local data center resources.

Let's take a closer look at AWS IAM, learn how it works and review best practices to help use resources securely.

AWS IAM is an Amazon cloud offering that manages access to compute, storage and other application services in the cloud. IAM's primary capability is managing access and permissions. It provides two essential functions that work together to establish basic security for enterprise resources: authentication, which verifies the identity of a user or service, and authorization, which determines what an authenticated principal is allowed to do.

IAM deals with four principal entities: users, groups, roles and policies. These entities detail who a user is and what that user is allowed to do within the environment. A user is an identity with its own credentials; a group is a collection of users managed together; a role is an identity with permissions that can be assumed temporarily; and a policy is a document that defines permissions.

IAM is fully interoperable with most compute, container, storage, database and other AWS cloud offerings. However, IAM is not fully compatible with all offerings on the platform, so it is best to check compatibility before implementing the service. For example, Amazon Elastic Compute Cloud (EC2) does not fully support resource-level permissions or authorization based on tags.

IT teams can manage and share a single business account between many different users, each using unique credentials. Administrators can create policies to establish granular permissions and grant users access to different resources depending on their identity. Changes to IAM, such as creating or updating users, groups, roles and policies, take time because they must be replicated to multiple servers globally, so IAM changes should not be treated as time-critical operations.

The common IAM process breaks down into four distinct phases: a principal makes a request, IAM authenticates the principal, IAM evaluates the applicable policies to authorize the request, and the approved action is performed on the resource.

IT teams can access AWS IAM in four ways: the AWS Management Console, the AWS Command Line Interface (CLI), SDKs and APIs. Each technique is used for different purposes, but the underlying IAM service is the same. IT pros use the AWS Management Console or AWS CLI to make requests that are processed through IAM, while applications use the SDK or API.
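As a quick illustration of the SDK path, this boto3 sketch lists the IAM users in an account; it assumes credentials are already configured through the standard AWS credential chain:

```python
# One of the four access paths named above: the SDK. Lists IAM users,
# with credentials resolved from env vars, shared config or an instance role.
import boto3

iam = boto3.client("iam")
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        print(user["UserName"], user["Arn"])
```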

IAM is essential to cloud security, but it also poses some complexity for inexperienced cloud administrators. Here are some best practices to enhance IAM effectiveness and help avoid common security mistakes.

Never use root credentials. A business might create a single AWS account with root credentials and then establish many different users and roles with other credentials. The root account should always be the most protected and secure entity within an AWS environment. Never use or share root credentials under any circumstances -- even for administrative activities.

Use groups for IAM policies. While it is possible and sometimes necessary to apply policies to individual users, it's better to apply group policies instead. For example, rather than managing policies for 10 individual HR staff members, put them into an HR group and apply a single HR policy to the entire group. This is faster and causes fewer oversights that compromise security. Groups also make it easier to move users as their jobs change.
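A minimal boto3 sketch of the group-based approach described above; the policy ARN and user names are placeholders, and the users are assumed to exist already:

```python
# Create an HR group, attach one policy to it, and add users to the group,
# instead of managing each user's permissions individually.
import boto3

iam = boto3.client("iam")
iam.create_group(GroupName="hr")
iam.attach_group_policy(
    GroupName="hr",
    PolicyArn="arn:aws:iam::123456789012:policy/hr-access",  # placeholder
)
for name in ["alice", "bob"]:  # pre-existing users
    iam.add_user_to_group(GroupName="hr", UserName=name)
```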

Apply conditions to IAM policies. AWS users can apply conditions to policies that place additional stipulations on resource access. Conditions can include date and time limitations, IP source address ranges and requirements for Secure Sockets Layer encryption. For example, conditions may specify that users must authenticate with MFA before they are allowed to terminate an EC2 instance. Conditions are not always necessary, but they add another layer of security for sensitive requests.
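The MFA example above translates into a policy condition like the following sketch; the policy name is a placeholder:

```python
# Allow ec2:TerminateInstances only when the caller authenticated with MFA,
# using the aws:MultiFactorAuthPresent condition key.
import json

import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "ec2:TerminateInstances",
        "Resource": "*",
        "Condition": {"Bool": {"aws:MultiFactorAuthPresent": "true"}},
    }],
}
boto3.client("iam").create_policy(
    PolicyName="terminate-requires-mfa",  # placeholder name
    PolicyDocument=json.dumps(policy),
)
```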

Use least privilege in IAM. The principle of least privilege gives users only the minimum access rights needed to do their jobs. Users and groups alike should be given only the rights required to accomplish necessary tasks.
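In policy form, least privilege means granting specific actions on specific resources rather than wildcards. A sketch, with a placeholder bucket name:

```python
# Read-only access to a single bucket instead of s3:* on all resources.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-reports",      # the bucket itself
            "arn:aws:s3:::example-reports/*",    # the objects in it
        ],
    }],
}
```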

Use MFA for better security. IAM supports multifactor authentication, which requires an additional credential based on a physical item that the user possesses. While MFA may not be appropriate for all cloud users, it is a useful addition for high-security users such as cloud administrators and senior business staff.

Use strong passwords. IAM allows cloud administrators to implement a custom password policy that can force stronger password selection -- such as longer strings with mixes of case, numerals and symbols -- and require regular password changes. Stronger passwords are more difficult to crack through systematic attempts and enhanced cloud security.
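Such a password policy can be set programmatically. A boto3 sketch with illustrative values:

```python
# Custom account password policy: longer strings, mixed character
# classes and regular rotation, as described above.
import boto3

boto3.client("iam").update_account_password_policy(
    MinimumPasswordLength=16,
    RequireUppercaseCharacters=True,
    RequireLowercaseCharacters=True,
    RequireNumbers=True,
    RequireSymbols=True,
    MaxPasswordAge=90,            # force regular changes
    PasswordReusePrevention=5,    # block the last 5 passwords
)
```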

Use unique access keys. Access keys act as passwords for applications. Encrypt all keys that are embedded in an application and never use the same key for more than one application. It may be safer and more effective to set up an application to receive temporary credentials using IAM roles rather than using access keys.
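The role-based alternative looks like the following sketch, in which an application trades a role for short-lived credentials; the role ARN is a placeholder:

```python
# Short-lived credentials from an assumed role instead of long-lived keys.
import boto3

creds = boto3.client("sts").assume_role(
    RoleArn="arn:aws:iam::123456789012:role/app-role",  # placeholder
    RoleSessionName="app-session",
    DurationSeconds=3600,  # credentials expire after an hour
)["Credentials"]

s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```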

Remove outdated IAM credentials. Locate and remove IAM passwords and keys that are idle to increase security. Principals that no longer use IAM, such as users that left the company or deprecated applications, no longer need credentials. Remove those credentials to prevent the principals from accessing the environment in the future.
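A sweep for idle credentials can be automated. This sketch flags access keys that have not been used in 90 days, a threshold chosen here for illustration:

```python
# Flag access keys unused for 90+ days so an administrator can
# deactivate or delete them.
from datetime import datetime, timedelta, timezone

import boto3

iam = boto3.client("iam")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        keys = iam.list_access_keys(UserName=user["UserName"])
        for key in keys["AccessKeyMetadata"]:
            last = iam.get_access_key_last_used(AccessKeyId=key["AccessKeyId"])
            used = last["AccessKeyLastUsed"].get("LastUsedDate")
            if used is None or used < cutoff:
                print("stale:", user["UserName"], key["AccessKeyId"])
```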

Review IAM policies and permissions regularly. Business and security needs change over time. Establishing and applying policies is just a start. Review and update policies on a regular basis to ensure that the organization's security posture meets business and compliance demands. If a group no longer needs a specific resource, remove that resource from the group policy to prevent unwarranted access.

Monitor the AWS account. Log files are a primary source of security information that yield details about user access, actions, outcomes and resource status. AWS provides logging features in multiple AWS services, including Amazon CloudFront, AWS CloudTrail, Amazon CloudWatch, AWS Config and Amazon Simple Storage Service. Cloud administrators should take advantage of every relevant log service to validate and maintain security in the AWS cloud.
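As one example of log-based monitoring, this sketch pulls recent console sign-in events from CloudTrail:

```python
# Review recent ConsoleLogin events: who accessed the account and when.
import boto3

ct = boto3.client("cloudtrail")
events = ct.lookup_events(
    LookupAttributes=[{"AttributeKey": "EventName",
                       "AttributeValue": "ConsoleLogin"}],
    MaxResults=50,
)
for e in events["Events"]:
    print(e["EventTime"], e.get("Username", "?"))
```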

Read more here:
An introduction to AWS IAM best practices - TechTarget


Modern Clouds Transition Of Storage Solutions From File To Block To Object – Influencive

Remember when data was just a piece of information stored in a file on your computer? Occasionally, you or your employees would visit the same file to access the information and then get back to work. It was so simple, right?

Today, times have changed. Technology has evolved and so has the means of saving data. The age of data being stored locally on devices is in the past.

As the world steps into the age of digital transformation, we have innovated a new, modern solution for data storage.

To understand what we have now, we must first have an idea of what we had decades ago.

Data storage can have three different types, the first one being the file storage system. It is as simple as it sounds. All you need to do is give the content a file name, add metadata, and save it within subdirectories of directories.

The naming conventions are used to make it easy to trace a file back to the system and use it as required. What makes this type of storage system inadequate is its inability to provide ubiquitous access to data: employees working remotely at distant locations do not have access to locally stored data.

IT administrators working with the system can easily identify its shortcomings. Even though this type of storage has been part of the industry since its infancy, leaders today want a centralized data storage solution for flexibility and accessibility. This is how the idea of a block storage structure came into the picture.

Instead of storing data as files, organizations shifted to a block storage system where the data would be divided into smaller chunks and then stored in blocks.

Most companies use centralized data servers hosted by third-party organizations to save their enterprise data.

Opting for providers helps to outsource the job of managing and storing data to third-party enterprises, freeing the organization from managing the tedious job themselves.

The operating system decides which data goes to which block, eliminating the need to add any metadata to the block. This is another way of storing files but it has its limitations.

Firstly, the storage is tied to only a single server at a time. Secondly, you are expected to pay for the block even if you aren't using it.
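Before moving on, the core mechanics of block storage, splitting data into fixed-size chunks and reassembling them on demand, can be sketched in a few lines of Python (block size shrunk here for readability):

```python
# Split data into fixed-size chunks and track which block holds which
# chunk. Real block devices do this at the driver level with e.g. 4 KiB.
BLOCK_SIZE = 8  # bytes; toy-sized for clarity

def to_blocks(data: bytes) -> dict:
    return {i: data[off:off + BLOCK_SIZE]
            for i, off in enumerate(range(0, len(data), BLOCK_SIZE))}

blocks = to_blocks(b"quarterly sales figures, FY2021")
reassembled = b"".join(blocks[i] for i in sorted(blocks))
assert reassembled == b"quarterly sales figures, FY2021"
```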

This brings us to the third and the preferred form of storage, object-based storage.

As the name suggests, data is stored in isolated containers which are called objects. Each object has an identifier which makes accessing it from a pool of data simpler and faster. Furthermore, these objects can be saved either in the local storage space or in a remote server that is miles away from the organization.

With an object-based data storage system, you add flexibility as well as scalability to the entire system. It keeps pace with the growth of data, and you only pay for what you use. This is one of the reasons why object-based storage has gained tremendous attention.
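The essence of the object model, a flat namespace of identifier-addressed blobs with attached metadata, fits in a short Python sketch:

```python
# Each object is a blob plus metadata behind a flat identifier, so
# retrieval is a direct lookup rather than a walk through directories.
import uuid

store: dict = {}

def put_object(data: bytes, metadata: dict) -> str:
    object_id = str(uuid.uuid4())   # the object's identifier
    store[object_id] = (data, metadata)
    return object_id

oid = put_object(b"<jpeg bytes>", {"content-type": "image/jpeg"})
data, meta = store[oid]             # direct retrieval by identifier
```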

Expanding on that, allow us to highlight the cloud-based solution Tebi. Decentralization has become the need of the hour as the limitations of centralized solutions have made themselves apparent. Tebi, being a geo-distributed data centre, facilitates ease of storage as well as ease of access.

Instead of having all your organizational data stored in a single location, the data is distributed across geographical locations and corresponding data centres.

This ensures that your data is accessible by users across the globe without complications. It takes the concept of object storage and applies it across a cluster of networks.

It is one of the leading solutions in the field of storage today and irrespective of what your organization is and how much data you produce, it will help you improve the scalability, reliability, and accessibility on a global level. Read more at bhtnews.com.

Published May 30th, 2021

See the article here:
Modern Clouds Transition Of Storage Solutions From File To Block To Object - Influencive


Bitcoin is greener than many — including Elon Musk — think it is – MarketWatch

It's been three weeks since Tesla CEO Elon Musk tweeted that the electric-car company had dropped bitcoin as a payment option, citing concerns over the cryptocurrency's link to greater consumption of fossil fuels.

Since May 12, bitcoin BTCUSD, -0.86% has plunged by about a third, dragged down, in part, by criticism over its carbon footprint.

But the issue is not so simple.

Today I'm joined by Alexander Benfield, a cryptocurrency analyst at Weiss Ratings. Instead of focusing on overall market dynamics, we'll talk about bitcoin and issues surrounding its energy consumption during the mining process.

The cryptocurrency's network relies on computers solving puzzles, which uses electricity. Annual power consumption of bitcoin mining is about 130 terawatt-hours, according to the University of Cambridge. To put that in perspective, the U.S. uses almost 4,000 terawatt-hours of electricity a year.
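The puzzle in question is proof of work: miners search for a nonce whose hash meets a difficulty target. A toy Python version shows the mechanism; real mining performs this search quintillions of times per second, hence the energy draw:

```python
# Toy proof of work: find a nonce whose SHA-256 hash starts with enough
# zeros. Difficulty here is tiny; bitcoin's target is astronomically harder.
import hashlib

def mine(block_header: str, difficulty: int = 4) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_header}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

print(mine("example header"))
# For scale: 130 TWh / 4,000 TWh is roughly 3.3% of annual US electricity use.
```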

MarketWatch: A claim that bitcoin is an energy hog has been around for a while. How much merit is there to such a claim?

Benfield: That is a question with a multi-faceted answer. Yes, bitcoin does consume a lot of energy, but that does not necessarily translate into carbon emissions. Much of bitcoin mining uses renewable energy; depending on the source, that number ranges between 39% and 73%, which is far higher than the percentage of renewable energy in the U.S. power grid. So even going by the low estimates, bitcoin is far more energy-conscious than the average industry. Additionally, a considerable amount of bitcoin mining actually uses excess energy that would otherwise be wasted in areas where it can't be exported to nearby city infrastructure. For example, bitcoin miners in rural China use hydroelectric energy that would otherwise be wasted due to low local energy demand and the inability to transport that excess energy to an urban power grid.

MarketWatch: Some analysts say bitcoin is actually greener than many people think. What do they mean by that?

Benfield: Nic Carter has done some amazing research into this topic and is constantly trying to prove this point on television. (Carter is a general partner at Castle Island Ventures, a Cambridge, Mass.-based venture firm.) However, many critics don't care to listen. Cathie Wood recently took to Bloomberg to talk about potential ways of incorporating bitcoin mining into renewable energy providers' power grids to capitalize on the intermittent periods when their excess energy is currently wasted. (Wood is CEO of active-ETF manager ARK Invest.) So perhaps bitcoin can actually help take advantage of much more wasted energy than was previously thought.

MarketWatch: So far, we have established that bitcoin is somewhat energy hungry. What is the purpose of all that energy expenditure?

Benfield: Bitcoin's energy usage makes bitcoin more secure. The cost of attacking bitcoin rises along with the increase in the computational power and the energy consumed by those mining or securing the network.

MarketWatch: We hear a lot about the advent of cryptocurrencies that spend less energy than bitcoin does. What can you tell us about them?

Benfield: Many of the green cryptos are marketing their blockchain as energy efficient because this is better than saying that they have underdeveloped networks that nobody is using, validating or mining on. That being said, proof-of-stake cryptocurrencies are typically much more energy efficient and new projects will likely shift their attention toward proof of stake because of the energy benefits.

MarketWatch: Will bitcoin evolve and grow to surpass its hunger for energy? What's next in store for the world's most popular cryptocurrency?

Benfield: Much of bitcoin's energy use to date has been for mining new coins, not for the actual processing of transactions. After all the coins have been mined, energy usage is likely to come down, as validating transactions uses far less energy than coin mining. There is also the possibility that scaling solutions and upgrades that have been in the works for years could help cut down on energy expenditure by offloading some transaction processing to layer 2s or sidechains. These sidechains or layer 2s would then checkpoint on the bitcoin blockchain; similar to the Lightning solution, individual transactions would be handled off the main chain, and summaries of those transactions would be stored on the main bitcoin blockchain at those checkpoints.

MarketWatch: Finally, is this energy issue big enough to jeopardize bitcoin and cryptocurrencies as a store of value?

Benfield: No, at the end of the day the issue of bitcoin's energy consumption boils down to whether the consumption is worth it. Bitcoin's adopters will eventually need to demonstrate bitcoin's societal value to the world to justify its energy footprint.

There you have it. After having this conversation with Alex, reviewing Nic Carter's research (the link is above; I highly recommend you read it) and other papers on the topic, it seems that many of the concerns about bitcoin's carbon footprint may have been overblown or simply misrepresented.

Determining bitcoin's effect on the environment requires a lot of big-picture thinking. It is easy to miss the forest for the trees, and easier still to rely on information that has since been debunked, simply because it favors one's cognitive bias.

The way I see it, cryptocurrencies aren't going away, and by the looks of it, neither is bitcoin. Current market action looks like nothing out of the ordinary, just more volatile crypto action, the likes of which we've seen in the past. This slump is likely just a pause.

What do you think? Do you support the use of bitcoin or would you rather invest in one of the green cryptocurrencies? Which one?

Let me know in the comment section below.

Here is the original post:
Bitcoin is greener than many --- including Elon Musk --- think it is - MarketWatch


Bitcoin Price Volatility Reached Its Highest In A Year During May – Forbes

Bitcoin volatility hit a 13-month high in May amid sharp fluctuations. (Photo illustration by Edward Smith/Getty Images)

Bitcoin prices had a wild May, experiencing sharp gyrations while they lost close to half their value in a matter of weeks.

The digital currency's annualized 30-day volatility reached 116.62% on May 24, its highest since April 10, 2020, data provided by asset manager Blockforce Capital reveals.

This particular measure climbed to its loftiest reading in more than 13 months shortly after bitcoin made two separate attempts to break through the $30,000 level, CoinDesk data shows.
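Blockforce Capital's exact methodology isn't given here, but a common way to compute an annualized 30-day volatility figure is the standard deviation of daily log returns scaled by the square root of 365. A Python sketch with illustrative prices:

```python
# One common construction of annualized volatility: stdev of daily log
# returns over the window, scaled by sqrt(365). Prices are placeholders.
import math
import statistics

def annualized_vol(prices: list) -> float:
    returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    return statistics.stdev(returns) * math.sqrt(365) * 100  # percent

daily_closes = [58_000, 56_500, 57_200, 49_000, 42_500, 36_800,
                30_200, 38_900, 41_900, 34_700, 31_200]  # illustrative
print(f"{annualized_vol(daily_closes):.1f}%")
```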

The digital asset fell to roughly $30,000 on May 19, and then after recovering to nearly $42,000 a few days later, it declined once again, dropping below $31,200, additional CoinDesk figures reveal.

[Ed note: Investing in cryptocoins or tokens is highly speculative and the market is largely unregulated. Anyone considering it should be prepared to lose their entire investment.]

Bitcoin experienced some strength earlier in the month, approaching $60,000 on May 8 and trading near that price level for a few days.

However, the world's most prominent digital currency encountered some inevitable volatility, breaking through the $50,000 and $40,000 levels, before making attempts on $30,000.

By the time it reached its intra-month low of approximately $30,200, it had declined more than 47% from its May high of more than $59,500.

Bull Market

While bitcoin did suffer some notable price declines last month, which helped fuel the digital asset's volatility, these developments took place after the cryptocurrency experienced some very impressive gains.

The digital asset rose to nearly $65,000 in April, setting a fresh, all-time high more than triple the size of the prior high of nearly $20,000 reached during the 2017-2018 bull run.

Further, bitcoin rose to this latest high after bottoming out near $3,000 in late 2018, languishing during the so-called Crypto Winter, where digital asset prices suffered and industry projects struggled to get the funding they needed.

As for where the digital currency will go next, it's anyone's guess, but many market observers have been pointing out that this bull run is different from the last one, driven by a separate set of circumstances.

Whereas the sharp price gains that bitcoin enjoyed in 2017 and early 2018 were attributed to variables like retail interest and particularly strong sentiment, institutional investors have been credited with playing a key role in the digital currency's upward movement during the current bull market.

Disclosure: I own some bitcoin, bitcoin cash, litecoin, ether and EOS.

Go here to see the original:
Bitcoin Price Volatility Reached Its Highest In A Year During May - Forbes
