
Bitcoin’s 2023 rally gathers steam as cryptocurrency briefly tops $23,000 – CNBC

  1. Crypto Markets Analysis: Bitcoin's Surge Moves Both Short- and Long-Term Holders Into Profitability  CoinDesk
  2. BTC metrics exit capitulation: 5 things to know in Bitcoin this week  Cointelegraph


What is Data Science? – GeeksforGeeks

Data Science is an interdisciplinary field that focuses on extracting knowledge from data sets that are typically very large. The field encompasses preparing data for analysis, analyzing it, and presenting findings to inform high-level decisions in an organization. As such, it incorporates skills from computer science, mathematics, statistics, information visualization, graphic design, and business.

Data is everywhere and is one of the most valuable assets of any organization; it helps a business flourish by supporting decisions based on facts, statistics, and trends. This growing scope of data is what brought data science, a multidisciplinary IT field, into the picture, and data scientist jobs are among the most in-demand of the 21st century. Data science, and in essence data analysis, plays an important role by helping us discover useful information in data, answer questions, and even predict the future or the unknown. It uses scientific approaches, procedures, algorithms, and frameworks to extract knowledge and insight from huge amounts of data.

Data science brings together ideas, data examination, machine learning, and related strategies to comprehend and dissect real-world phenomena with data. It is an extension of data analysis fields such as data mining, statistics, and predictive analytics, and it borrows methods and concepts from other fields, including information science, statistics, mathematics, and computer science. Some of the techniques utilized in data science include machine learning, visualization, pattern recognition, probabilistic modeling, data engineering, and signal processing.

Data scientists straddle the worlds of business and IT and possess unique skill sets. Their role has assumed significance thanks to how businesses today think about big data. Businesses want to make use of unstructured data, which can boost their revenue, and data scientists analyze this information to make sense of it and surface insights that aid the growth of the business.

Now, let's get started with the foremost topic, Python Packages for Data Science, which will be the stepping stone to start our Data Science journey. A Python library is a collection of functions and methods that lets us perform many actions without writing the code ourselves.

1. Scientific Computing Libraries:

2. Visualization Libraries:

3. Algorithmic Libraries:

For example, scikit-learn's digits dataset is returned as a dictionary-like object containing a data array (each 8x8 image flattened to 64 features), a target array of digit labels, target_names (the digits 0 to 9), the raw images, and a DESCR string describing the dataset:

Optical recognition of handwritten digits dataset. Number of instances: 5,620; number of attributes: 64 (8x8 images of integer pixels in the range 0..16); missing attribute values: none; creator: E. Alpaydin; date: July 1998. This is a copy of the test set of the UCI ML handwritten digits dataset. Preprocessing programs made available by NIST were used to extract normalized bitmaps of handwritten digits from a preprinted form. Of 43 contributors, 30 contributed to the training set and a different 13 to the test set. Each 32x32 bitmap is divided into non-overlapping 4x4 blocks and the number of on pixels is counted in each block, generating an 8x8 input matrix where each element is an integer in the range 0..16. This reduces dimensionality and gives invariance to small distortions.
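The 4x4 block-counting preprocessing described in the dataset notes above can be sketched in plain Python. This is an illustrative reimplementation, not NIST's or scikit-learn's actual code; the function name and test bitmap are ours:

```python
def downsample_bitmap(bitmap):
    """Reduce a 32x32 binary bitmap to an 8x8 grid of on-pixel counts,
    one count per non-overlapping 4x4 block (each count is in 0..16)."""
    assert len(bitmap) == 32 and all(len(row) == 32 for row in bitmap)
    grid = []
    for block_row in range(8):
        row = []
        for block_col in range(8):
            # Count the on pixels inside this 4x4 block.
            count = sum(
                bitmap[block_row * 4 + r][block_col * 4 + c]
                for r in range(4)
                for c in range(4)
            )
            row.append(count)
        grid.append(row)
    return grid

# An all-on bitmap: every 4x4 block contains 16 on pixels.
solid = [[1] * 32 for _ in range(32)]
print(downsample_bitmap(solid)[0])  # [16, 16, 16, 16, 16, 16, 16, 16]
```

As the dataset notes say, this reduces a 1,024-pixel image to 64 features while smoothing out small distortions in the handwriting.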


12 Biggest Cloud Providers by Market Share in the World

In this article, we take a look at the 12 biggest cloud providers by market share in the world. If you want to skip ahead, go directly to 5 Biggest Cloud Providers by Market Share in the World.

Cloud computing is a growing sector.

Cloud computing has two main sectors, both of which are growing: infrastructure as a service (IaaS) and software as a service (SaaS).

In terms of cloud infrastructure as a service, there aren't many big providers in the world, given the substantial upfront capital needed to be competitive in the field. Because the IaaS market is fiercely competitive, with many of the biggest players constantly cutting their fees per unit of processing, only the largest players are profitable. As the leader, Amazon Web Services accounts for a substantial percentage of Amazon's total profits, while big providers such as Google Cloud are still losing money in an effort to grow their market share.

The cloud software as a service market is arguably less competitive, as it is more segmented. And because many customers subscribe to SaaS providers, there is also recurring-revenue potential.

Third Quarter 2022 Cloud Infrastructure Services Industry Growth

While it is very competitive, the cloud infrastructure services market is huge.

According to the Synergy Research Group, total enterprise spending on cloud infrastructure services in the third quarter of 2022 rose by more than $11 billion from Q3 2021, to more than $57 billion. During the quarter, the top of the cloud market was highly concentrated: the top three players, Amazon, Microsoft, and Google, held around a two-thirds share of the global cloud infrastructure services market.

Chief analyst at Synergy Research Group John Dinsdale said of the Q3 industry results: "It is a strong testament to the benefits of cloud computing that, despite two major obstacles to growth, the worldwide market still expanded by 24% from last year. Had exchange rates remained stable and had the Chinese market remained on a more normal path, then the growth rate percentage would have been well into the thirties. The three leading cloud providers all report their financials in US dollars, so their growth rates are all beaten down by the historic strength of their home currency."


In terms of their individual growth rates, Amazon Web Services expanded by 27.5% year over year, Microsoft's Azure expanded 35% year over year and Google Cloud Platform grew 38% year over year.

Given the growth in the third quarter, the trailing twelve month revenues of the cloud infrastructure services market reached $217 billion according to Synergy Research Group estimates.

Given rising interest rates and macroeconomic uncertainty with a potential recession, some enterprises have become more cautious in their spending and the overall industry spend for the period was less than some estimates.

The Future Growth

Given the long-term tailwinds in the industry, analysts expect the cloud infrastructure as a service market to grow substantially in the future. According to Research and Markets, the global cloud computing market could grow at a CAGR of 15.7% from 2022 to 2030 to surpass $1.55 trillion by 2030, as more firms shift to cloud computing for cost savings and flexibility.
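As a rough sanity check on that projection, the implied 2022 starting size can be recovered by reversing the compound growth; the numbers below simply restate the figures quoted above:

```python
# Implied 2022 market size if $1.55 trillion is reached by 2030
# at a 15.7% compound annual growth rate (8 years of growth).
cagr = 0.157
years = 2030 - 2022
target_2030 = 1.55  # trillions of US dollars

implied_2022 = target_2030 / (1 + cagr) ** years
print(round(implied_2022, 2))  # about 0.48, i.e. roughly $480 billion
```

That implied base of roughly $480 billion is broadly in line with the Synergy Research Group trailing-twelve-month estimate of $217 billion for cloud infrastructure services alone, since the Research and Markets figure covers the wider cloud computing market.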

One reason for the substantially higher expected growth is AI processing. As AI applications like ChatGPT gain popularity, demand for cloud computing could grow further. Given the amount of cloud computing power needed to build an application as sophisticated as ChatGPT, big tech cloud providers are among the few companies that can currently offer the necessary processing.

In 2019, Microsoft invested $1 billion in OpenAI in the form of cash and credits for Microsoft's Azure cloud computing platform. Given that ChatGPT has experienced substantial demand since its launch, OpenAI is reportedly looking to raise more capital at around a $30 billion valuation. Once it raises more capital, a substantial amount will likely be spent on cloud processing to make ChatGPT even better, which could bring more business for Microsoft.

Methodology

For our list of 12 Biggest Cloud Providers by Market Share in the World, we list only the market shares of the cloud infrastructure as a service market and not the software as a service market because they are two different markets.

We ranked the cloud providers based on their market shares in the cloud infrastructure service market which includes platform as a service, infrastructure as a service and hosted private cloud services according to Synergy Research Group in Q3 2022.

Because their infrastructure as a service market shares are under 2%, we simply ranked Dell Cloud, VMware Cloud, Huawei Cloud, and Baidu Cloud as <2% in market share.

For those of you interested, also check out 15 Most Innovative Companies in the World.

Worldwide Market Share in Q3 2022: <2%

Baidu AI Cloud is the cloud computing service of Chinese internet search company Baidu. According to the company, it provides compute & storage, network & CDN, database, big data, and security services. In 2021, Baidu AI Cloud had around a 9% share of China's cloud infrastructure services market, ranking fourth behind Alibaba Cloud, Huawei Cloud, and Tencent Cloud. Because China's cloud market is substantial, Baidu AI Cloud is one of the larger IaaS cloud companies globally.

Worldwide Market Share in Q3 2022: <2%

Cloud infrastructure services are among the products of VMware, a leading provider of multi-cloud services that enables digital innovation with enterprise control. Originally part of Dell Technologies after the latter's acquisition of EMC Corp in 2015, VMware spun off from Dell Technologies in November 2021 so that the company could have more strategic flexibility and potentially grow faster, by gaining more freedom to invest in cloud computing.

Worldwide Market Share in Q3 2022: <2%

In addition to making PCs, Dell Technologies also provides infrastructure as a service with its Dell Technologies Cloud. According to the company, "The Dell Technologies Cloud IaaS model has been invaluable for enterprises. By provisioning and making fully managed servers, storage, networking and other compute resources accessible via the Internet, IaaS offerings allow businesses to avoid the cost and complexity of purchasing and managing their own infrastructure."

Worldwide Market Share in Q3 2022: <2%

Huawei Cloud is the cloud infrastructure computing service provided by Chinese technology company Huawei. Given its market share in China, Huawei Cloud could potentially rank higher on this list, but we placed it at #9 because Synergy Research Group didn't rank it among the top eight cloud infrastructure service providers globally in Q3 2022. According to China Internet Watch, Huawei Cloud had an 18% share of China's cloud infrastructure services spend in 2021, more than Tencent Cloud's 16% and Baidu AI Cloud's 9%. Because the company is private, however, it is difficult to determine its current market share, though it certainly ranks among the leaders.

Worldwide Market Share in Q3 2022: 2%

Oracle Cloud is database and enterprise software maker Oracle Corporation's IaaS offering, with around a 2% global market share. In Q2 fiscal 2023, Oracle Corporation's IaaS sales rose 53% year over year to $1 billion, slightly faster than the overall industry's growth. One reason for the fast growth could be that TikTok routed 100% of the app's US user traffic to Oracle Cloud Infrastructure as of June 2022; TikTok previously used its own data centers for US traffic.

Worldwide Market Share in Q3 2022: 2%

Tencent Cloud is a cloud computing service provided by Chinese social media giant Tencent. According to the company, Tencent Cloud is used by numerous software developers from many different industries, including Tencent's own WeChat messaging application. In terms of market share, Tencent Cloud had a 16% share of the China cloud infrastructure services industry in 2021 and around a 2% share of the cloud infrastructure services market worldwide in Q3 2022.

Worldwide Market Share in Q3 2022: 3%

In addition to being a leading SaaS provider, Salesforce Inc offers a platform as a service solution that "allows businesses to easily deploy, run, and manage custom cloud applications without the complexity of building and maintaining their own servers and infrastructure." With its Salesforce Platform and other services, Salesforce has a worldwide market share of 3% in the cloud infrastructure service market in Q3 2022.


Disclosure: None. 12 Biggest Cloud Providers by Market Share in the World is originally published on Insider Monkey.


Cloud egress costs: What they are and how to dodge them

The cloud's pay-as-you-go model offers flexibility and an easy way to expand data storage.

But, although most cloud providers allow free data uploads to their infrastructure, downloading or even moving data from cloud storage comes at a cost.

Those fees, or egress charges, are one of the hidden costs of cloud computing, and they can quickly mount up. In the most extreme cases, egress bill shock can make a cloud project so expensive that it's no longer viable.

"Cloud egress charges are a fee for network usage. They are any costs associated with moving data out of the cloud storage platform where the data is normally held," says Tony Lock at analyst Freeform Dynamics.

As such, egress charges are more than just a fee for downloads. Service providers can levy fees whenever data moves from a cloud storage platform, including to another cloud provider, to another region or availability zone, or even between applications.

One example is where a business moves data from archives to an analytics application. The CSP that hosts the archives will charge egress fees because the data leaves its storage, even though uploads to the analysis package are free.

And, warns Lock, some providers will levy egress charges to move data from storage into memory, for example for searches. In some circumstances, software-as-a-service (SaaS) applications will add their own egress charges for downloading data.

The charges are also asymmetrical. Cloud providers rarely charge for uploading data, or data ingress. Any costs they incur to bring data into their networks are wrapped up in subscription or other fees.

Rather like a supermarket that offers discounted goods as a loss leader, the cloud provider needs to offer cheap or free ingress to encourage customers to use their cloud.

Egress charges work the other way, by discouraging firms from transferring data out, either to other cloud providers, or to on-premise systems.

"They've made the commercial decision that ingress should be effectively absorbed within the consolidated cost of service represented in the unit prices of cloud components, but egress charges are separated out," says Adrian Bradley, head of cloud transformation at consulting firm KPMG. "At the heart of that, it is a real cost. The more a client consumes of it, the more it costs the cloud providers."

Firms have seen egress charges rise as they look to do more with their data, such as mining archives for business intelligence purposes, or to train artificial intelligence (AI) engines. Data transfers can also increase where organisations have a formalised hybrid or multi-cloud strategy.

"Either there's a need to do a lot more data egress, or perhaps there's simply the positive use of cloud to develop new products and services that intrinsically use more data," says Bradley.

The result is that firms are moving more data from cloud storage, and are being hit by increasing costs. Research by Aptum Technologies, a managed service provider, found that moving to the cloud resulted in higher-than-expected costs for 73% of firms, with 65% saying they had wasted money through inefficiencies in the cloud.

For chief information officers (CIOs), the risk from cloud egress fees is less the actual cost than their unpredictable and potentially uncontrollable nature.

Research by IDC estimates that planned and unplanned egress charges account for an average of 6% of organisations' cloud storage costs, itself a relatively small percentage. But that could still be enough to undermine the viability of a cloud storage project, and within that average, some firms will be paying more.

Data egress costs matter because, unlike subscriptions, they are not fixed and usually not negotiated in advance. Organisations can find that egress costs rack up because the business has changed its IT strategy, made an acquisition, entered a new market or come under regulations that force it to relocate data.

Even measures that bring efficiencies elsewhere, such as improved forecasting or machine learning, can push up cloud egress costs. In some cases, they can tip the balance between cloud and on-premise deployments.

Egress charges can also stand in the way of making cloud deployments more resilient, because they add to the running costs of hybrid and multi-cloud architectures. And, as it's a consumption-based charge, the more successful the cloud deployment, the higher the egress charges can be.

"These costs typically can't be covered by a customer's spend commitment. They're on top, which makes them even more unwelcome," says Patrick Smith, field chief technology officer for EMEA at storage supplier Pure.

This is made worse by a lack of transparency around egress fees. Although the charges are by no means new, their complexity makes them hard to predict and model. At KPMG, Bradley points to firms that suffer bill shock because they failed to carry out a detailed enough analysis of workloads before moving to the cloud.

"But the second kind of bill shock comes where patterns of consumption in a cloud environment evolve quite quickly," he says.

And there is a further risk. Firms that face unexpected egress charges might shy away from making full use of cloud-based data and lose competitive advantage as a result.

Strategies to reduce egress charges can be technical and architectural, or contractual. IT departments can try demand management to limit cloud storage and data transfers. However, micromanaging usage in a dynamic cloud environment is itself costly, and putting hard limits on data downloads, for example, risks breaking business processes further downstream.

Instead, it is better to choose workloads carefully and design cloud architectures to maximise efficiencies. Examples include reducing inter-regional data transfers, deploying data deduplication and compression, and rewriting data-intensive applications so they make fewer calls on cloud storage, such as by only downloading data differences, or deltas.

But contractual measures are as important.

Firms can negotiate to include egress, or some egress, in their subscription costs, or try to reduce regional transfer charges. And it can pay to spend more on some services: moving archived data to a tier suited to more frequent access can cost less than paying additional fees to retrieve it from cold storage.
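That tiering trade-off can be modelled with a simple break-even calculation. The per-GB rates below are hypothetical placeholders, not any provider's actual pricing:

```python
# Compare the monthly cost of a cold tier (cheap storage, paid
# retrieval) with a warmer tier (dearer storage, retrieval included).
# All rates are hypothetical, in $ per GB per month.
COLD_STORAGE, COLD_RETRIEVAL = 0.004, 0.02
WARM_STORAGE, WARM_RETRIEVAL = 0.010, 0.00

def monthly_cost(storage_rate, retrieval_rate, gb_stored, gb_retrieved):
    """Storage fee plus egress/retrieval fee for one month."""
    return storage_rate * gb_stored + retrieval_rate * gb_retrieved

gb_stored = 10_000
for gb_retrieved in (0, 1_000, 5_000):
    cold = monthly_cost(COLD_STORAGE, COLD_RETRIEVAL, gb_stored, gb_retrieved)
    warm = monthly_cost(WARM_STORAGE, WARM_RETRIEVAL, gb_stored, gb_retrieved)
    cheaper = "cold" if cold < warm else "warm"
    print(f"{gb_retrieved:>5} GB retrieved: cold=${cold:.2f} warm=${warm:.2f} -> {cheaper}")
```

With these illustrative rates, the warm tier wins once monthly retrieval passes 3,000 GB, which is exactly the "how cold is this data really?" question Lock raises below.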

"Make sure you know exactly what data you have stored in each cloud service, especially cold systems where egress charges might mount up quickly if the original assumption was that the data would not be recovered except in an emergency," says Lock.

"As more organisations look to use historical data in routine operational analytics, it might be time to consider just how cold most data really is. These factors all highlight the growing importance of holding much more detailed metadata than we have ever done before."

However, there is no industry-standard formula to calculate when egress charges mean it's no longer economical to store data in the cloud; this depends on the use case and the value of the data. Repatriating data to on-premise systems brings its own costs.

And, although cloud management tools are improving and firms are becoming better at understanding their data flows, this analysis is still not easy.

Nonetheless, KPMG's Bradley advises that CIOs can take three steps to control egress fees.

"One, really do the detailed analysis before you move," he says. "Two, be bold in looking at your architecture and rethinking at least some elements, whether moving workloads to a different place, a content delivery network, or caching, as that's what makes a structural difference. Third, make sure you have good visibility, so you know what you're spending on that egress and are managing it tightly."


AWS is spending $35 billion on one of its most troubled US cloud …

Amazon Web Services (AWS) has revealed plans to invest $35 billion in the US state of Virginia to expand its operations between now and 2040.

Virginia is home to the company's US-EAST-1 region, which has suffered significant outages and other issues in recent years, leading some to cite it as a major cause for concern about Amazon's ability to handle crises.

AWS has been operating in the state since 2006, and in 2018 chose Virginia as the site of its second headquarters (which it calls HQ2).

Governor of Virginia Glenn Youngkin announced the plans, which he believes will create at least 1,000 new jobs in the area, as he continued to express a clear interest in the industry within his state:

"Virginia will continue to encourage the development of this new generation of data center campuses across multiple regions of the Commonwealth. These areas offer robust utility infrastructure, lower costs, great livability, and highly educated workforces, and will benefit from the associated economic development and increased tax base, assisting the schools and providing services to the community."

Reuters also reports that the cloud storage giant had already invested $35 billion in northern Virginia data centers between 2011 and 2020, meaning that the continued investment could be a huge boost to the local economy.

Director of Economic Development at AWS Roger Wehner highlighted that the company's continued investment in the area since 2006 has already boosted the Commonwealth's GDP by nearly $7 billion, accounting for thousands of jobs.

In time, AWS will be eligible for a new Mega Data Center Incentive Program, should it be approved by the Virginia General Assembly. This would include a 15-year maximum extension of Data Center Sales and Use tax exemptions on qualifying equipment and enabling software, and a further grant worth up to $140 million for site and infrastructure improvements, workforce development, and other project-related costs.

Via The Register


A Guide to Deployment Models: On-Premise, Cloud, and Disconnected

The days when every service was managed from a single server or location are long gone. When organizations build software and services, they have a variety of options depending on their budget, infrastructure, and security needs. Now teams can build software across a broad network of machines and software managed either by their own staff or external services.

Which tools to use is a complex question, and there is no one-size-fits-all answer for every organization.

Today, we've put together an analysis of the pros and cons of some common cloud and on-premise options. Understanding the options available to your team can help you make the right choice and save money and time in the long term.

With on-premise software, the software is located and operated on the organization's own computing hardware. On-premise deployments are set up and maintained by the organization on its own network.

Security: By keeping everything in-house and cutting exposure to the outside world, organizations retain all their data, which is especially important if they are in a regulated industry. Teams can implement their own security policies to remain compliant and keep their data protected, and on-premise deployment allows them to establish a perimeter, which is inherently more secure than cloud hosting.

User Control: In a more regulated industry, it's important to maintain greater control of the environment and assets. On-premise is ideal for organizations that want to keep full control of their uptime and infrastructure, and that prefer to maintain the software themselves.

Initial Cost: Depending on the size of deployment, an on-premise solution can be cheaper than a private cloud solution, especially in the short term.

If on-premise is already in use: While cloud is the most sleek and modern deployment model, it can actually cause problems if a company already hosts a variety of other software on-premise and then chooses to host one tool in the cloud. Organizations with a robust DevOps infrastructure that are used to deploying in-house will find it easier to continue adding on-premise solutions. Deploying to the cloud carries benefits but isn't a cure-all; it requires careful consideration for each unique environment.

Human Capital Requirements: On-premise solutions require dedicated IT support year-round to ensure the software is running properly.

Mobility: Unlike cloud hosting, an on-premise deployment is not travel-friendly: if members of the team are expected to be out of the office or outside the approved field of use, an on-premise system may not be flexible enough to keep working on the go.

Hidden Spend: Sometimes there can be hidden expenses associated with an on-premise solution, anything from server space to upgrades, maintenance, patches, and even natural disasters. This can end up costing an organization in labor and budget.

Infrastructure and Maintenance: From server space, deployment, downtimes, backups, and updates, an on-premise solution requires a lot of upkeep on the organization's end to keep it running like a well-oiled machine.

Scalability: On-premise means needing to know the approximate size and provisioning needs of an operation. This is where it pays off to be forward-thinking and estimate the business's potential growth far into the future. An on-premise system cannot scale as quickly and easily as a cloud solution, which has the potential to hinder a team's speed of growth. Choosing on-premise means knowing the organization's needs before it's too late.

Cloud deployments consist of a remote network of servers that are connected together and used to store, process, and manage data remotely. Unlike an on-premise or disconnected deployment, these are often managed by a third party.

Cost savings: Because cloud providers typically offer a pay-as-you-scale model, organizations can save money on hardware, maintenance, and other upfront costs.

Scalability: Cloud systems are highly scalable, which means that organizations can easily increase or decrease their usage as needed. This can be especially useful for those that experience fluctuating demand for their services.

Accessibility: With a cloud system, users can access their data and applications from any device with an internet connection. This can be especially useful for remote teams or organizations with employees in different locations.

Reliability: Cloud providers typically have robust infrastructure and backup systems in place to ensure that their services are always available. This can provide peace of mind for organizations that rely on their systems to run their business.

Security: Many cloud providers offer advanced security measures to protect their customers' data, which can be more secure than maintaining these measures on-premise. Outside of highly regulated industries, cloud solutions are plenty secure for the majority of teams.

Externally Managed: Because cloud providers are responsible for maintaining and updating their systems, organizations that use the cloud can take advantage of new features and innovations without having to invest time or resources. Cloud allows teams to put more dollars and hours towards innovation.

Limited control: When multiple customers share the same instance of the software, there is often less control over it, which leaves less room for the customization found in a single-tenant system. This can be a problem for organizations with very specific requirements or unique needs.

Shared resources: In a multi-tenant system, the resources of the software (such as processing power, memory, and storage) are shared among all of the customers. This means that if one customer experiences a high level of usage, it could potentially impact the performance of the software for other customers.

Security concerns: Some organizations may have concerns about the security of a multi-tenant system, as they are sharing the same instance of the software with other customers. While a security breach is unlikely, it is still less secure than an on-premise or disconnected deployment. Choosing the right provider and service can ensure a more secure cloud.

Misconfiguration - Just as with on-premise servers, effective policy and administration are important. However, because of the ease of access and high availability, mistakes in the cloud can quickly spiral into disasters. After all, turning on sharing for a file or server can expose it to the whole world.

Integration - Although this is constantly improving, cloud services often don't have the same connections that on-premise servers have. This is especially true for some compliance standards and legacy systems.
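The misconfiguration risk above is easy to illustrate. The sketch below scans a hypothetical inventory of storage configurations for world-readable shares; the field names ("name", "public_access") are invented for illustration, not taken from any real cloud API:

```python
# Flag world-readable resources in a hypothetical config inventory.
# Field names are illustrative assumptions, not a real cloud API.

def find_public_resources(resources):
    """Return names of resources whose sharing is open to everyone."""
    return [r["name"] for r in resources if r.get("public_access", False)]

inventory = [
    {"name": "finance-reports", "public_access": False},
    {"name": "marketing-assets", "public_access": True},  # shared with the world
]
print(find_public_resources(inventory))  # ['marketing-assets']
```

Real platforms ship their own policy-audit tooling for this; the point is that a single flipped flag is all it takes, so automated checks pay for themselves quickly.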

Cloud computing is the current wave of software delivery, and many services today are offered through varying cloud service models. One of the primary divisions is whether an organization draws from a collective resource pool or gets a specific service commitment (private vs. public).

In a private cloud service model, also known as a single-tenant cloud service, each customer has their own dedicated instance of the software. This means the customer has complete control over their instance of the software, including the ability to customize it to their specific needs and requirements.

On the other hand, in a public cloud service model, the provider delivers a multi-tenant service where multiple customers share the same instance of the software; you may see this called SaaS (Software as a Service). This means that the software provider is responsible for maintaining and managing the software, which can be more efficient and cost-effective for customers.

However, it also means that the customers do not have as much control over the software and may not be able to customize it as much as they would in a single-tenant system. There are some concerns that a multi-tenant solution is less secure; if customers choose a multi-tenant solution from an organization with good security policies, it can be just as secure as a single-tenant deployment.

Overall, the choice between single- and multi-tenant cloud software depends on the specific needs and priorities of the customer.

A disconnected deployment is exactly what it sounds like. These on-premise deployments are cut off from the public internet for security and compliance reasons.

These are also referred to as air gapped, meaning a clear physical separation to create a private network. This isolation may avoid connections even from networks within the same organization.
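As a rough illustration of what "no route to the public internet" means in practice, an air gap can be sanity-checked by confirming that outbound connections fail. This is a minimal sketch, not a compliance test; the hostname and timeout are assumptions:

```python
# Rough sanity check that a host has no route to the public internet.
# A real air-gap audit involves far more than this; illustrative only.
import socket

def can_reach(host, port=443, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# On a properly air-gapped machine this should print False.
print(can_reach("example.com"))
```

Note that isolation in these environments is physical, not just logical, so a check like this is only a smoke test on top of the network design itself.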

Security: Disconnected environments are hands down the most secure deployment model. In a highly regulated industry, organizations can benefit from the iron walls a disconnected deployment has around data or intellectual property. This helps address both known attacks and theoretical threats.

Compliance Requirements: Disconnected deployments are the only model compatible with air-gapped developer environments.

Cost: A disconnected solution will cost teams the most. Due to the additional infrastructure resources required, disconnected deployments are not the most cost-effective.

Data Lag: While any disconnected deployments worth their salt will ensure daily data updates, some updates may take longer. For the majority of teams, this is not a major concern, but if real-time data is a priority for an organization, disconnected deployments may be too slow for their needs.

Infrastructure and Maintenance: Ensure the right infrastructure is in place before buying a disconnected solution. From server space to compute and administrative resources, there is an additional cost to running a disconnected deployment.

No deployment model is perfect, and all come with their challenges. An organization needs to know which capabilities will best suit its specific needs. A general understanding of which direction to proceed can save time and research.
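That direction-finding can be condensed into a rough rule-of-thumb sketch based on the trade-offs discussed above. The priority ordering here is an assumption for illustration, not a formal recommendation:

```python
# Rough rule-of-thumb for picking a deployment model, condensing the
# trade-offs discussed above. The priority order is an assumption.

def suggest_deployment(air_gap_required, needs_deep_customization,
                       cost_sensitive):
    """Suggest a deployment model from a few coarse requirements."""
    if air_gap_required:
        # Only disconnected deployments satisfy air-gapped compliance.
        return "disconnected"
    if needs_deep_customization:
        # Single-tenant keeps full control over the instance.
        return "single-tenant"
    if cost_sensitive:
        # Multi-tenant SaaS shares infrastructure cost across customers.
        return "multi-tenant"
    return "multi-tenant"

print(suggest_deployment(air_gap_required=True,
                         needs_deep_customization=False,
                         cost_sensitive=False))  # disconnected
```

In practice the decision also weighs factors like data-lag tolerance and in-house infrastructure resources, so treat this as a starting point for the conversation, not the conclusion.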

To sum up what we've discussed so far: cloud deployments offer accessibility and low maintenance overhead at the cost of some control, while disconnected deployments trade higher cost and data lag for maximum security and compliance.

More questions about deployment models? We're always happy to chat. Schedule a demo with one of our experts to learn more.

See original here:
A Guide to Deployment Models: On-Premise, Cloud, and Disconnected


Webb telescope spies frozen molecules inside space cloud | CNN

CNN

The James Webb Space Telescope peered inside a wispy molecular cloud located 630 light-years away and spied ices made of different elements.

Molecular clouds are interstellar groupings of gas and dust where hydrogen and carbon monoxide molecules can form. Dense clumps within these clouds can collapse to form young stars called protostars.

The Webb telescope focused on the Chamaeleon I dark molecular cloud, which appears blue in the new image. A young protostar, called Ced 110 IRS 4, glows in orange to the left. The journal Nature Astronomy published a study including the image on Monday.

More orange dots represent light from stars in the background, piercing through the cloud. The starlight helped astronomers determine the diverse range of frozen molecules within the Chamaeleon I dark molecular cloud, which is forming dozens of young stars.

The Webb telescope views the universe through infrared light, which is invisible to the human eye. Infrared light can reveal previously hidden aspects of the cosmos and pierce dense clusters of gas and dust that would otherwise obscure the view.

Astronomers have used the space observatory to discover some of the coldest ices yet measured in the darkest regions of a molecular cloud. During a survey of the cloud, the international research team identified water ice, as well as frozen forms of ammonia, methanol, methane and carbonyl sulfide.

These icy molecules could contribute to the formation of stars and planets and even the building blocks of life.

Ices can supply planets with carbon, hydrogen, oxygen, nitrogen and sulfur, elements that are incorporated into planetary atmospheres as well as amino acids, sugars and alcohols, and could thereby contribute to the formation of a habitable planet like Earth.

"Our results provide insights into the initial, dark chemistry stage of the formation of ice on the interstellar dust grains that will grow into the centimeter-sized pebbles from which planets form in disks," said lead study author Melissa McClure, an astronomer and assistant professor at Leiden Observatory in the Netherlands, in a statement. McClure is the principal investigator of the observing program.

"These observations open a new window on the formation pathways for the simple and complex molecules that are needed to make the building blocks of life."

In addition to simple molecules, the researchers saw evidence of more complex molecules.

"Our identification of complex organic molecules, like methanol and potentially ethanol, also suggests that the many star and planetary systems developing in this particular cloud will inherit molecules in a fairly advanced chemical state," said study coauthor Will Rocha, an astronomer and postdoctoral fellow at Leiden Observatory, in a statement.

"This could mean that the presence of precursors to prebiotic molecules in planetary systems is a common result of star formation, rather than a unique feature of our own solar system."

Astronomers used starlight filtering through the cloud to search for chemical fingerprints and identify the elements.

"We simply couldn't have observed these ices without Webb," said study coauthor Klaus Pontoppidan, Webb project scientist at the Space Telescope Science Institute in Baltimore, in a statement.

"The ices show up as dips against a continuum of background starlight. In regions that are this cold and dense, much of the light from the background star is blocked, and Webb's exquisite sensitivity was necessary to detect the starlight and therefore identify the ices in the molecular cloud."

Go here to see the original:
Webb telescope spies frozen molecules inside space cloud | CNN
