
Encryption, Key Management – bank information security


Read more:
Encryption, Key Management - bank information security

Read More..

What is cloud computing? – Definition from WhatIs.com

Cloud computing is a general term for anything that involves delivering hosted services over the Internet. These services are broadly divided into three categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). The name cloud computing was inspired by the cloud symbol that's often used to represent the Internet in flowcharts and diagrams.

A cloud service has three distinct characteristics that differentiate it from traditional web hosting. It is sold on demand, typically by the minute or the hour; it is elastic -- a user can have as much or as little of a service as they want at any given time; and the service is fully managed by the provider (the consumer needs nothing but a personal computer and Internet access). Significant innovations in virtualization and distributed computing, as well as improved access to high-speed Internet, have accelerated interest in cloud computing.

A cloud can be private or public. A public cloud sells services to anyone on the Internet. (Currently, Amazon Web Services is the largest public cloud provider.) A private cloud is a proprietary network or a data center that supplies hosted services to a limited number of people. Private or public, the goal of cloud computing is to provide easy, scalable access to computing resources and IT services.

Private cloud services are delivered from a business's data center to internal users. This model offers the versatility and convenience of the cloud, while preserving the management, control and security common to local data centers. Internal users may or may not be billed for services through IT chargeback. Common private cloud technologies and vendors include VMware and OpenStack.

In the public cloud model, a third-party cloud service provider delivers the cloud service over the internet. Public cloud services are sold on demand, typically by the minute or hour, though long-term commitments are available for many services. Customers only pay for the CPU cycles, storage or bandwidth they consume. Leading public cloud service providers include Amazon Web Services (AWS), Microsoft Azure, IBM and Google Cloud Platform.

A hybrid cloud is a combination of public cloud services and an on-premises private cloud, with orchestration and automation between the two. Companies can run mission-critical workloads or sensitive applications on the private cloud and use the public cloud to handle workload bursts or spikes in demand. The goal of a hybrid cloud is to create a unified, automated, scalable environment that takes advantage of all that a public cloud infrastructure can provide, while still maintaining control over mission-critical data.

In addition, organizations are increasingly embracing a multicloud model, or the use of multiple infrastructure-as-a-service providers. This enables applications to migrate between different cloud providers or to even operate concurrently across two or more cloud providers. Organizations adopt multicloud for various reasons. For example, they could do so to minimize the risk of a cloud service outage or to take advantage of more competitive pricing from a particular provider. Multicloud implementation and application development can be a challenge because of the differences between cloud providers' services and application program interfaces (APIs). Multicloud deployments should become easier, however, as providers' services and APIs converge and become more homogeneous through industry initiatives such as the Open Cloud Computing Interface.

Cloud computing boasts several attractive benefits for businesses and end users, chief among them the on-demand self-service, elasticity and pay-per-use pricing described above.

Although cloud computing has changed over time, it has been divided into three broad service categories: infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS).

IaaS providers, such as AWS, supply a virtual server instance and storage, as well as APIs that enable users to migrate workloads to a VM. Users have an allocated storage capacity and can start, stop, access and configure the VM and storage as desired. IaaS providers offer small, medium, large, extra-large and memory- or compute-optimized instances, in addition to customized instances, for various workload needs.
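
To make that workflow concrete, here is a minimal sketch of provisioning an IaaS instance with the AWS SDK for Python (boto3); the AMI ID, region and instance size are placeholder choices for illustration, not recommendations:

```python
# Minimal sketch: provisioning an IaaS virtual server with boto3.
# The AMI ID and region below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a small general-purpose instance (sizes map to the
# small/medium/large tiers mentioned above).
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)

# Stop and start the instance on demand -- you pay only while it runs.
ec2.stop_instances(InstanceIds=[instance_id])
ec2.start_instances(InstanceIds=[instance_id])
```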

In the PaaS model, cloud providers host development tools on their infrastructures. Users access these tools over the internet using APIs, web portals or gateway software. PaaS is used for general software development, and many PaaS providers host the software after it's developed. Common PaaS providers include Salesforce's Force.com, AWS Elastic Beanstalk and Google App Engine.

SaaS is a distribution model that delivers software applications over the internet; these applications are often called web services. Users can access SaaS applications and services from any location using a computer or mobile device that has internet access. One common example of a SaaS application is Microsoft Office 365 for productivity and email services.

Cloud providers are competitive, and they constantly expand their services to differentiate themselves. This has led public IaaS providers to offer far more than common compute and storage instances.

For example, serverless, or event-driven, computing is a cloud service that executes specific functions, such as image processing and database updates. Traditional cloud deployments require users to establish a compute instance and load code into that instance. Then, the user decides how long to run -- and pay for -- that instance.

With serverless computing, developers simply create code, and the cloud provider loads and executes that code in response to real-world events, so users don't have to worry about the server or instance aspect of the cloud deployment. Users only pay for the number of transactions that the function executes. AWS Lambda, Google Cloud Functions and Azure Functions are examples of serverless computing services.
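
As a rough illustration, here is a minimal Lambda-style handler in Python; the event shape (an S3-style object notification) and the processing step are assumptions for the sake of the example:

```python
# Minimal sketch of a serverless, event-driven function in the style of
# AWS Lambda. The provider loads this code and invokes handler() once per
# event; the user never manages the underlying server or instance.
import json

def handler(event, context):
    # Pull the uploaded object's key out of the triggering event
    # (assumed here to be an S3 object-created notification).
    record = event["Records"][0]
    key = record["s3"]["object"]["key"]

    # Do the actual work here, e.g. create a thumbnail or update a database.
    print(f"Processing {key}")

    # Billing is per invocation, not for an always-on server.
    return {"statusCode": 200, "body": json.dumps({"processed": key})}
```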

Public cloud computing also lends itself well to big data processing, which demands enormous compute resources for relatively short durations. Cloud providers have responded with big data services, including Google BigQuery for large-scale data warehousing and Microsoft Azure Data Lake Analytics for processing huge data sets.

Another crop of emerging cloud technologies and services relates to artificial intelligence (AI) and machine learning. These technologies build machine understanding, enable systems to mimic human understanding and respond to changes in data to benefit the business. Amazon Machine Learning, Amazon Lex, Amazon Polly, Google Cloud Machine Learning Engine and Google Cloud Speech API are examples of these services.

Security remains a primary concern for businesses contemplating cloud adoption -- especially public cloud adoption. Public cloud service providers share their underlying hardware infrastructure between numerous customers, as public cloud is a multi-tenant environment. This environment demands copious isolation between logical compute resources. At the same time, access to public cloud storage and compute resources is guarded by account login credentials.

Many organizations bound by complex regulatory obligations and governance standards are still hesitant to place data or workloads in the public cloud for fear of outages, loss or theft. However, this resistance is fading, as logical isolation has proven reliable, and the addition of data encryption and various identity and access management tools has improved security within the public cloud.

Here is the original post:
What is cloud computing? - Definition from WhatIs.com

Read More..

7 Best Managed Cloud Hosting Providers of 2018 – WPMyWeb


When it comes to hosting something bigger, we generally prefer Cloud Hosting. Cloud Hosting is a newer kind of hosting solution for larger websites and projects. Its functionality is similar to VPS Hosting, but it works in a different way.

If you are a developer or have a blog website, we highly recommend getting Managed WordPress Hosting instead of Cloud Hosting. But if you are running a big project or have multiple websites that receive millions of visits, Cloud Hosting will be the perfect option. In this article, we will cover the 7 best Cloud Hosting providers of 2018.

Before we dig into the 7 best Cloud Hosting providers, let's see the difference between Cloud and VPS hosting.

These days there is a lot of talk about VPS and Cloud Hosting. Many web hosting companies recommend them if you have an organization or business, and there is no doubt that businesses can reap huge benefits from Managed Cloud Hosting. However, it is still confusing whether to choose VPS or Cloud Hosting. So, before purchasing, you need to know the key differences between VPS and Cloud Hosting.

A VPS, also known as a Virtual Private Server, is a server with its own copy of an operating system and its own share of the server's resources. Your VPS runs in a virtual environment: it behaves like a dedicated personal computer of your own, even though it shares physical hardware with other virtual servers. Let's see the pros and cons of VPS hosting.

Advantages of VPS Hosting:

Disadvantages of VPS Hosting:

Check out: 7 Best VPS Hosting Providers in 2017

In contrast, cloud servers are connected together in a cluster and backed by SAN storage. Unlike a VPS, a cloud server is spread across multiple machines. You can get unlimited storage, bandwidth and top-notch hardware. Generally, there are more benefits to using Managed Cloud Hosting.

Advantages of Cloud Hosting:

Disadvantages of Cloud Hosting:

There are many Cloud Hosting providers available, each with different features, prices and support, so it's hard to choose the best one. That's why we made this list of the 7 best Cloud Hosting providers, which are less expensive than other hosting companies yet provide all the features you need.

HostGator is one of the best Cloud Hosting providers and supercharges your site easily. It provides four times the resources of shared hosting, so you won't hit resource limits when traffic is high. You can monitor and allocate resources easily from your hosting dashboard.

After deploying your virtual server, if a hardware issue arises, your site is automatically switched to another server. This way your server stays 100% online. Besides, their optimal caching configuration makes your site faster. Making the site faster isn't the end, though; you will also want to track your visitors, page download speed, uptime and so on, and you can do all of this from your dashboard.

You can get Cloud Hosting at an affordable price. Their basic plan starts from only $4.95/m, which offers 1 CPU core, 2GB memory, unlimited storage and unlimited data transfer, and lets you host one domain. Each plan includes local caching, powerful SSD drives, cPanel, data monitoring tools and more.

Features:

Price: $4.95/m (We are using HostGator Cloud Hosting)

Bluehost is a WordPress-recommended host which provides all kinds of hosting at an affordable price. In 2015, Bluehost launched its Cloud Hosting, which offers similar features and pricing to HostGator. If you are looking for cheap Cloud Hosting, Bluehost is a good option and a solid HostGator alternative too.

They offer free site migration, so if you already own a website, you can easily move your existing site. Besides, their fully managed hosting allows you to add additional resources as your needs grow, set up automatic failover, monitor resources and more. They also provide integrated caching, so your site loads fast.

Their standard Cloud plan starts from $6.95/m, which includes a 2-core CPU, 2 GB RAM, 100 GB SSD storage, 1 free domain, unmetered bandwidth and 500 MB email storage.

Features:

Price: $6.95/m

DreamHost is a WordPress-recommended host that has been powering websites since 1995 and now serves over 1.5 million of them. Its hosting platforms are specially optimized for WordPress, and it also offers several other kinds of hosting. If you are a beginner, you can start with their shared hosting, which offers free domain name registration.

No matter if you are a blogger or a developer, their managed Cloud Hosting is a great option. Unlike other hosting providers, you can launch your cloud server within 30 seconds. They don't impose limits on your hosting, and you can run any application on their high-performance SSD cloud servers. If you are thinking of migrating your site or project to their servers, they will take care of it. Their hassle-free cloud servers let you control everything you need.

Another thing we like about DreamHost is that their Cloud Hosting is much cheaper than other hosting companies'. Their starter cloud plan costs only $4.50/month, and all accounts come with 100 GB of block storage and free bandwidth.

Features:

Price: $4.50/m

A2 Hosting is one of the fastest Cloud Hosting providers. Their cloud hosting is a bit more expensive, but you get the full benefit from it.

If you search for A2 Hosting reviews on Google, you will see it is one of the most highly rated hosts. They provide almost every feature a business site needs to run without hassle. If you are using another host and want to switch to a high-performance hosting environment, A2 Hosting is the perfect solution for you.

Their basic plan starts from $15/month and offers a 10Gbps redundant network, ultra-fast hardware, high-availability failover, dedicated IPs, full root access and much more. Unlike other cloud hosting providers, you can configure your own customized cloud hosting, which means you only pay for what you use.

Features:

Liquidweb provides highly managed Cloud Hosting that is suitable for all kinds of websites and brands. High-traffic sites frequently get traffic spikes that cause downtime, but Liquidweb can easily manage heavy traffic. Whether your site gets 10 hits or 10 million, their web server clusters continue to serve your traffic with no downtime.

Liquidweb has also simplified website management with a simple dashboard. From the dashboard, you can create your website in just a few seconds. With their Cloud Hosting, you don't have to set up and configure a server; it is all managed for you. If you have one website or many, no problem: they allow you to host an unlimited number of websites at no extra charge.

Their price starts from $59/m and offers 1 TB monthly bandwidth, 50 GB SSD storage, the simple dashboard and CloudFlare CDN.

Features:

Price: $59/m

Vultr is one of the best high-performance Cloud Hosting providers and a great alternative to DigitalOcean. It can manage both light and heavy traffic. All their servers are built on powerful infrastructure with SSD drives and a lightning-fast network, and they have a total of 15 data centers around the world.

From their admin panel, you can deploy CentOS, Debian, Ubuntu, Windows and other operating systems. You get full root access to control everything, and a dedicated IP address is included with all VMs. You can also use their API to deploy, destroy and control your instances.

They offer hourly and monthly billing, so you can pay per hour or per month. Their Cloud Hosting plan starts from $2.50/mo, which includes 1 CPU core, 512 MB memory and 500 GB bandwidth.

Features:

Price: $2.50/m

SiteGround offers fully managed Cloud Hosting. Whether you have a small development project or want to host a high-traffic site, SiteGround is a strong option. With every plan, you get WHM & cPanel, 1 dedicated IP, an IP tables firewall, a free SSL certificate, SSH access and more.

Their starting plan costs $80/m and offers 2 CPU cores, 4 GB memory, 40 GB SSD space and 5 TB data transfer. They use lightweight Linux containers for their managed servers, and you can add additional resources at any time to auto-scale during traffic spikes without rebooting. From their cPanel, you can launch WordPress or any other popular CMS with just one click. They also offer daily backups, a free CDN, SuperCacher, staging and more.

As a VIP user, you will get blazing-fast support from real experts, with ticket replies in under 10 minutes.

Features:

Price: $80/m

Get SiteGround Cloud Hosting

In this article, we have shown a compact list of the 7 best Managed Cloud Hosting providers of 2018. I hope this helps you find the best Cloud Hosting provider for you. If you have any suggestions, please let us know by leaving a comment. You can also find us on Facebook, Twitter and Google+.

Related Articles,

See more here:
7 Best Managed Cloud Hosting Providers of 2018 - WPMyWeb

Read More..

Cloud Hosting vs. Shared Hosting Comparison | HostGator

To have a site live on the Internet you're going to need web hosting. But how do you choose the right kind of hosting for your business? And what is the difference between cloud hosting vs. shared hosting?

Web hosting can be complex and there are a lot of options. It's easy to get lost, so don't feel bad if you're confused at the starting point.

Below we dive into the differences between cloud hosting and shared hosting. These are two of the most common hosting choices. Remember, there is no right choice for everyone; the right type for you depends upon the needs of your website.

Let's jump in!

Shared web hosting is the cheapest, most popular and most widely available type of hosting solution. You'll usually see it advertised for $9.99 or less a month.

With shared hosting a single server is divided up between multiple users. Each user will get a shared amount of bandwidth. However, each user can also put an unlimited amount of sites on their account. So the server that's being split up between multiple accounts can sometimes end up hosting thousands of sites!

This means that your site could perform poorly if another site on the server is taking up too many resources. However, web hosts usually do their best to mitigate these effects.

Cloud hosting is better for larger sites, while shared hosting can be a great choice for those with a very tight budget or for people who are planning on keeping their sites very small.

Curious about cloud hosting? Cloud hosting, or cloud VPS hosting, allows you to use the resources of multiple servers, rather than having your site confined to a single server location. This makes cloud services highly sought after.

The main benefit of cloud hosting is that it allows for unlimited expansion and is a must-have for heavy traffic sites. Cloud hosting also allows for greater protection from an overwhelmed server. If one server is overwhelmed, you'll simply be switched to another cloud server.

Cloud hosting is widely seen as a better option than shared hosting because of its ability to handle large amounts of traffic, its improved security protection, and its reliability.

However, these extras do come at a cost, and most cloud hosting options are more expensive than shared hosting plans. But if you're planning on growing your site and need high performance, then cloud hosting will probably be the best option for your needs.

That being said, if you're just getting started and have a very small or nonexistent budget, a shared hosting plan may be a good option for you until you have the cash to upgrade.

What hosting plan do you have? Let us know in the comments below.

Related

More here:
Cloud Hosting vs. Shared Hosting Comparison | HostGator

Read More..

Microsoft's quantum computing network takes a giant leap …

Microsoft is focusing on the development of quantum computers that take advantage of cryogenically cooled nanowires. (Microsoft Photo)

REDMOND, Wash. Quantum computing may still be in its infancy, but the Microsoft Quantum Network is all grown up, fostered by in-house developers, research affiliates and future stars of the startup world.

The network made its official debut today here at Microsoft's Redmond campus, during a Startup Summit that laid out the company's vision for quantum computing and introduced network partners to Microsoft's tools of the quantum trade.

Quantum computing stands in contrast to the classical computer technologies that have held sway for more than a half-century. Classical computing is based on the ones and zeroes of bit-based processing, while quantum computing takes advantage of the weird effects of quantum physics. Quantum bits, or qubits, needn't represent a one or a zero, but can represent multiple states during computation.

The quantum approach should be able to solve computational problems that can't easily be solved using classical computers, such as modeling molecular interactions or optimizing large-scale systems. That could open the way to world-changing applications, said Todd Holmdahl, corporate vice president of Microsoft's Azure Hardware Systems Group.

"We're looking at problems like climate change," Holmdahl said. "We're looking at solving big food production problems. We think we have opportunities to solve problems around materials science, personal health care, machine learning. All of these things are possible and obtainable with a quantum computer. We have been talking around here that we're at the advent of the quantum economy."

Representatives from 16 startups were invited to this week's Startup Summit, which features talks from Holmdahl and other leaders of Microsoft's quantum team, as well as demos and workshops focusing on Microsoft's programming tools. (The closest startup to Seattle is 1QBit, based in Vancouver, B.C.)

Over the past year and a half, Microsoft has released a new quantum-friendly programming language called Q# (Q-sharp) as part of its Quantum Development Kit, and has worked with researchers at Pacific Northwest National Laboratory and academic institutions around the world to lay the technical groundwork for the field.

A big part of that groundwork is the development of a universal quantum computer, based on a topological architecture that builds error-correcting mechanisms right into the cryogenically cooled, nanowire-based hardware. Cutting down on the error-producing noise in quantum systems will be key to producing a workable computer.

"We believe that our qubit equals about 1,000 of our competition's qubits," Holmdahl said.

There's lots of competition in the quantum computing field nowadays: IBM, Google and Intel are all working on similar technologies for a universal quantum computer, while Canada's D-Wave Systems is taking advantage of a more limited type of computing technology known as quantum annealing.

This week, D-Wave previewed its plans for a new type of computer topology that it said would reduce quantum noise and more than double the qubit count of its existing platform, from 2,000 linked qubits to 5,000.

But the power of quantum computing shouldn't be measured merely by counting qubits. The efficiency of computation and the ability to reduce errors can make a big difference, said Microsoft principal researcher Matthias Troyer.

For example, a standard approach to simulating the molecular mechanism behind nitrogen fixation for crops could require 30,000 years of processing time, he said. But if the task is structured to enable parallel processing and enhanced error correction, the required runtime can be shrunk to less than two days.

"Quantum software engineering is really as important as the hardware engineering," Troyer said.

Julie Love, director of Microsoft Quantum Business Development, said that Microsoft will start out offering quantum computing through Microsoft's Azure cloud-based services. Not all computational problems are amenable to the quantum approach: it's much more likely that an application will switch between classical and quantum processing, and therefore between classical tools such as the C# programming language and quantum tools such as Q#.

"When you work in chemistry and materials, all of these problems, you hit this known-to-be-unsolvable problem," Love said. "Quantum provides the possibility of a breakthrough."

Love shies away from giving a firm timetable for the emergence of specific applications, but last year Holmdahl predicted that commercial quantum computers would exist five years from now. (Check back in 2023 to see how the prediction panned out.)

The first applications could well focus on simulating molecular chemistry, with the aim of prototyping better pharmaceuticals, more efficient fertilizers, better batteries, more environmentally friendly chemicals for the oil and gas industry, and a new class of high-temperature superconductors. It might even be possible to address the climate change challenge by custom-designing materials that pull excess carbon dioxide out of the air.

Love said quantum computers would also be well-suited for addressing optimization problems, like figuring out how to make traffic flow better through Seattle's urban core, and for reducing the training time required for AI modeling.

"That list is going to continue to evolve," she said.

Whenever the subject of quantum computing comes up, cryptography has to be mentioned as well. It's theoretically possible for a quantum computer to break the codes that currently protect all sorts of secure transactions, ranging from email encryption to banking protocols.

Love said those code-breaking applications are farther out than other likely applications, due to the huge amount of computational resources that would be required, even for a quantum computer. Nevertheless, it's not too early to be concerned. "We have a pretty significant research thrust in what's called post-quantum crypto," she said.

Next-generation data security is one of the hot topics addressed by the $1.2 billion National Quantum Initiative that was approved by Congress and the White House last December. Love said Microsoft's post-quantum crypto protocols have already gone through an initial round of vetting by the National Institute of Standards and Technology.

"We've been working at this in a really open way," she said.

Like every technology, quantum computing is sure to have a dark side as well as a bright side. But it's reassuring to know that developers are thinking ahead about both sides.

Read this article:
Microsofts quantum computing network takes a giant leap ...

Read More..

Best Cloud Storage Reviews 2019 | 31 Best Services Compared

Cloud storage services are so popular that some have hundreds of millions of users. They can store your content in the cloud and sync it across your devices but you can use them as more than that. They let you preview and share files, collaborate with your teammates, play music and videos and more.

Not only will uploading your files make it easier to collaborate with your team, but it will free space on your hard drive, too. Plus, in the event that your hard drive malfunctions or gets stolen, your files will still be in the cloud. Documents, designs, reports or even bigger files, such as 3D models and movies, are all good candidates for a trip to the cloud.

Many of the services can function as a network drive, which means that any content you place in it will be stored only in the cloud and available for preview on your computer. You won't be able to access it when offline, but it won't take up space on your hard drive, either.

If your focus is on collaboration, read our best cloud storage for collaboration article. If you're looking for backup, check out our list of the best cloud backup services. To be clear on the difference between storage and backup, read this article.

If you're not sure how much space you need and you lean towards the "a lot" side, read our guide for large files article.

Cloud storage services can be many things to many people, but for the purposes of this comparison, we took into account the criteria they all share and rated them accordingly. We're going to outline them in this section.

File sharing and syncing features are the foundation of a cloud storage service. You'll more than likely use them first, because syncing will get your files to the cloud while sharing will help you, well, share them with others. Most of the services use the common model of sync developed by Dropbox in 2007. Read more about it in our Dropbox review.

The common model of sync consists of a system tray icon and a sync folder. The system tray icon is your go-to way of accessing your cloud storage. In most cases it opens your sync folder and provides links to the cloud storage web client and the settings menu. The sync folder shows all the content that you've synced to your computer.

Many services let you share files using the desktop client, but all let you share them using the web client. You can share content to social media networks, via email, by copying and pasting a link or inviting users to your folders.

Folder invites can have different permissions attached to them while links have their own content control options such as passwords, expiry dates, download limits and more.

In this category, we check if a service integrates with third-party apps which include collaboration staples such as Microsoft Office Online, Google Office Suite, Trello, Asana and more. Many of the popular services have their own apps such as tasks managers, workflow and note-taking apps. Music and video players are also common.

Dropbox has Paper, Box has Box Notes and OneDrive has OneNote. You can see our comparison of them in our best note-taking apps article. We've compared the popular Google Docs vs. Dropbox Paper, too.

If you love photos and videos, read our guide for photos and videos article (no, Google Photos is not the only service you can use) to see which service will help you stream your multimedia from the cloud. We didn't forget the music lovers, who can consult our guide for music piece.

iOS and Android are the most popular mobile operating systems, and most services offer an app for these platforms. However, having an app doesn't mean it's good, so we test all mobile apps to see if they can match their desktop counterparts.

Placing your content in the cloud without proper security is a bad idea. Criminals might steal your credentials, someone might read your secret information and the government might be browsing through your photos. That said, having good security is a must, so we consider how strong the security of each service is.

Cloud services use many methods to secure your data against potential threats. Two-factor authentication will stop hackers who've stolen your password from accessing your account. Still, you should make sure you have a strong password to begin with.
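
For illustration, here is a minimal sketch of the time-based one-time password (TOTP) scheme behind most two-factor authentication apps, using the pyotp library; the secret is generated on the spot rather than taken from any real service:

```python
# Minimal sketch of TOTP-based two-factor authentication with pyotp.
import pyotp

# The service generates and stores a shared secret for your account...
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# ...and your authenticator app derives a short-lived 6-digit code from it.
code = totp.now()
print("Current code:", code)

# At login, the service verifies the code alongside your password, so a
# stolen password alone is not enough to get in.
print("Accepted:", totp.verify(code))
```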

The TLS protocol prevents man-in-the-middle attacks from succeeding, while encryption secures your data in transit and at rest. Private encryption prevents anyone other than you from reading your files. The drawback is that services which provide it won't be able to reset your password if you forget it. To avoid losing access to your content, use a password manager.
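
To show what private encryption means in practice, here is a minimal sketch using Python's cryptography package; the password-to-key derivation shown is one common choice, not any particular provider's scheme:

```python
# Minimal sketch of client-side ("private") encryption: the file is
# encrypted before upload, so the provider only ever sees ciphertext.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

password = b"correct horse battery staple"
salt = os.urandom(16)

# Derive the encryption key from the user's password. This is why a
# private-encryption service cannot reset a forgotten password:
# without the password there is no key.
kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=600_000)
key = base64.urlsafe_b64encode(kdf.derive(password))

ciphertext = Fernet(key).encrypt(b"contents of the file to upload")
# Upload `ciphertext` (plus the salt); only the password holder can decrypt.
plaintext = Fernet(key).decrypt(ciphertext)
```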

Ransomware can take your cloud data hostage and demand payment for its release, but our ransomware protection article can help you avoid it.

We considered all these features and how services implement them in our most secure cloud storage article.

Your security might be sufficient, but that doesn't mean it guarantees your privacy. It's no secret that governments spy on their citizens, thanks to laws such as the USA PATRIOT Act and CLOUD Act. The PRISM surveillance program in the U.S. is one example of that. With those in play, it's paramount that you ensure the privacy of your information on the web.

That said, services differ in their approach to privacy. Google Drive and Dropbox have been connected to the PRISM project, while Google scans your content and email to give you targeted advertising. You can find out more about Google's approach in our Google Drive review.

How strong your privacy is also depends on which country the service is based in. Some countries, such as Canada and Switzerland, take more care of users' privacy rights than others.

We always like to see a service which adheres to the General Data Protection Regulation, the EU's iron-clad approach to cloud privacy. Many do, but those that stand out are Sync.com, Tresorit and MEGA.

One of the most important factors when choosing a cloud storage service is the price; there's no point in having a great service that costs an arm and a leg. It's best if it has premium plans with good value.

Good value is determined by how much you get for the price. The more plans a service has, the better your options will be. It's great if the service offers a free plan or trial, too, so you can test it before committing. If you're only concerned with good value, read our best deals in cloud storage article.

Most of the services offer a trial or a free plan. Free plans range from Dropbox's meager 2GB to Google Drive's generous 15GB, which you can also use for Google Photos. You can get more information about free deals in our best free cloud storage piece.

Like the previous category, ease of use is important because no matter how many great features a service has, they won't do users much good if the service is difficult to use.

Because of that, a straightforward user experience is better than one that's complex, outdated and requires you to get help from an IT genius. Cloud storage services should work on most operating systems, as well as have attractive and intuitive interfaces.

Most desktop clients work on Windows and macOS, but we give bonus points to those that work on Linux. You can see which those are in our best online storage for Linux piece. If you work on Windows, consult our guide for Windows.

MEGA and OneDrive stand out in this category, so read our MEGA review and OneDrive review to learn more.

Fast speeds depend on how close you are to a server and on your internet service provider. We looked for services that let you tweak transfer settings to improve your connection and that use a block-level transfer algorithm, which speeds up updating files that have already been uploaded by only sending the parts that have changed.
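
Here is a minimal sketch of the block-level idea in Python; upload_block() is a hypothetical stand-in for a provider's real transfer call:

```python
# Minimal sketch of block-level sync: split a file into fixed-size blocks,
# hash each one, and upload only the blocks whose hashes changed since the
# last sync.
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks, a typical choice

def block_hashes(path):
    hashes = []
    with open(path, "rb") as f:
        while block := f.read(BLOCK_SIZE):
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

def sync(path, previous_hashes):
    current = block_hashes(path)
    with open(path, "rb") as f:
        for index, digest in enumerate(current):
            block = f.read(BLOCK_SIZE)
            # Skip blocks the server already has from the last upload.
            if index < len(previous_hashes) and previous_hashes[index] == digest:
                continue
            upload_block(path, index, block)  # hypothetical transfer call
    return current  # store these hashes for the next sync
```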

To grade services, we perform several upload and download tests using a 1GB folder and measure how much time they take. We take into account our distance to a server, too.

Nobody wants to encounter a problem using a cloud storage service, but once they do, good technical support is priceless. Services usually have email support, while some offer chat or even phone support.

Before contacting the support team, though, users can consult the FAQ, knowledgebase and in many cases, user forums. Often, these will have tutorial videos, too. We take into account what types of support the services offer and how long they take to respond to our questions.

Some cloud storage services are better at one thing than another, but for the purposes of this ranking, we considered their overall performance across the criteria we outlined in the previous section. This section will give you a quick recap of the top five services on our list starting with the champion, Sync.com.

Sync.com is a service known for its strong security and privacy. It helps that it's based in Canada, which has strong privacy laws. It's one of the best zero-knowledge cloud storage services. The drawback of that is that it doesn't integrate with third-party apps.

It's also at the top of our list for sharing, thanks to its sharing and content control options, which protect your shared files and folders. They include password protection, private end-to-end encryption, expiry dates, download limits and more. To share content you can create links or team folders.

If you're thinking Sync.com must be expensive as hell because it provides all this, you'd be wrong. It has some of the most competitive prices on the market. Its cheapest plan, Pro Personal 500GB, is only $49 per year. The best deal, though, is its 2TB plan at $96 a year. Read our Sync.com review for more pricing details.

Right on the heels of Sync.com comes pCloud. It's not number one because it requires you to pay extra for private encryption, but it still has many redeeming qualities.

First, it's our top choice for playing music and videos and previewing photos. Plus, it's the best cloud storage service for Kodi, and pCloud is WebDAV-compatible, too. pCloud has another trick up its sleeve, and that's cloud-to-cloud backup: you can use it to back up content from Facebook, Instagram, Google Drive, Dropbox and OneDrive to pCloud.

Sharing and content control features work well, too. You can share folders and specific files via links. pCloud lets you share folders by inviting others and granting "can edit" or "can view" permissions. Alternatively, you can generate an upload link which others can use to upload directly to your folder, or a download link that enables them to download your files.

pCloud is a great value, as well; it even undercuts Sync.com by a slight margin. Its Premium 500GB plan is only $47.88 per year, while its best-valued plan, Premium 2TB, is $95.88 per year. To find out more about pCloud's offers, read our pCloud review.

MEGA advertises itself as "the privacy company," and that's justified considering it extends the GDPR's rules to all of its users, not just those based in the EU. Its zero-knowledge encryption helps protect your privacy, too.

It has one of the best user experiences on the market. The desktop app works on Windows, macOS and Linux while the web client is intuitive and attractive. You can easily upload your files by dragging and dropping them anywhere in the client window. To edit content you can right-click or use the three dots. You can move files simply by dragging them to a new location.

MEGA's web app lets you share your files by generating a link that you can protect with a key. That makes them zero-knowledge, so only people you give the key to can read them. You can attach the key to your link, which means anyone with the link can access it, or send the key separately.

You can share a folder by creating a link or inviting others via their email address. If you do that, you can set one of several permissions. To invite others to upload to your folder, including non-MEGA users, you can turn it into a MEGAdrop folder.

Tresorit is one of the most secure cloud storage services and one that has great privacy, too. Unlike with Sync.com, though, that comes at a steep price.

It has zero-knowledge encryption that protects files at rest, the TLS protocol which secures them in transit, two-factor authentication which helps protect your credentials and more. Its privacy policy is clear and easy to understand and navigate. It also adheres to the GDPR.

Its sharing capabilities don't lack, either. You can share content with specific individuals via email or by generating a link and copying and pasting it. You can share a folder by generating a link or inviting users via email. If you use an email to share a folder, the recipients will need to register for a Tresorit account. Files can only be shared with a link and don't have that requirement.

You can protect links using a password, expiry date or download limit. Folders have three levels of permissions: manager, which can share, edit and view; editor, which can read and modify; and viewer, which can only read. You can find more about Tresorit's sharing capabilities in our Tresorit review.

OneDrive is Microsoft's entry in the cloud storage market. It has a sketchy past regarding privacy, including being connected to the PRISM project. OneDrive has updated its security since, but it can't match the most secure cloud storage services.

Its user experience is among the best, though. The web client is attractive and intuitive to use. You can easily upload files by dragging and dropping. You can perform actions using right-click or by selecting content and then choosing the action from the menu.

Microsoft OneDrive integrates with Office Online, which you can use to collaborate with others no matter which plan you subscribe to. If you want to take notes and share them, you can use OneNote. To communicate with others, there's Skype, which is integrated with the web client. Productivity apps include Forms for workflow management and Sway for content publishing.

You can find out more about Microsoft OneDrive's features and its pricing plans in our OneDrive review.

Below we've put some of our readers' most frequently asked questions, plus the answers.

It's not the fluffy, white things in the sky. Instead, the term refers to software and services that run on the internet instead of locally on your computer. That software is stored in data centers which hold many servers. You can access it using a web browser or a dedicated desktop or mobile app.

Services such as Microsoft Azure, Amazon S3 and Wasabi are examples of the server infrastructure that acts as a service and helps you create your own cloud. You can find more options in our best cloud IaaS article.

The short answer is: it depends. It depends on what you're going to be using it for. If you're going to collaborate on Office documents and presentations and share notes, you don't need much; 100GB should be more than enough.

You should focus on services that have great collaboration capabilities. Dropbox for Business is a great choice, and you can find out more about it in our Dropbox for Business review. If it's not your cup of tea, Google Drive or Box are good alternatives. Read more about Box's collaboration capabilities in our Box review.

If you plan to store photos, you need more storage. pCloud is at the top of our providers for photos, and it has a cheap 2TB plan, too; it can hold around 600,000 photos averaging 3MB in size. If you're a professional photographer, you might prefer one of the services from our best cloud storage for photographers article.

Video requires even more storage but, once again, pCloud comes to the rescue. Its web player works without a hitch, and the 2TB plan will be able to accommodate your HD collection. For other options, consult our guide for storing videos in the cloud.

If you don't know how much storage you need (and chances are high you're going to need more and more), you should try one of the services which offer unlimited storage. You can find more about them on our best unlimited cloud storage providers list. Make sure to check our feature lists so you're aware of any file size restrictions or limitations.

Like with the internet in general, there's a client and a server. In this case, a client is a computer user subscribing to a cloud storage service, while the server is a machine in a data center.

A client syncs (sends) copies of files over the Internet to the data server, which then saves the information. When the client wishes to retrieve the information, he or she accesses the data server through a web, desktop or mobile client. The server then either sends the files back to the client or allows the client to access and manipulate the files on the server itself.

Cloud storage systems generally rely on hundreds of data servers (especially for unlimited storage providers). Computers can be unavailable at times because of crashes or maintenance, so data is usually stored on multiple machines. This is called redundancy. Without redundancy, a cloud storage system couldn't ensure clients that they could access their information at any given time.

Cloud storage is a cloud computing model in which data is stored and maintained on remote servers accessed from the internet, or cloud.

According to our most secure cloud storage article, Sync.com is the most secure cloud storage provider. Egnyte Connect, one of the best EFSS solutions, is hot on its heels, though. It's expensive for solo consumers, but it's a great choice for businesses that want strong security.

No matter what you need to use cloud storage services for, our ranking will help you choose the best one. We made it considering a number of factors including security, privacy, speed and more. Sync.com is our top choice because of its approach to security, privacy and its great value deals.

pCloud isn't far behind, but it's not as committed to privacy as Sync.com and it requires you to pay extra for private encryption. MEGA is great for privacy, but it can't quite match pCloud's value or its features. It has great ease of use, though. Tresorit is a security and privacy fortress, but it's expensive because of that.

Our last pick is OneDrive, which can't quite match the security or privacy of the others, but it does have strong collaboration features along with straightforward ease of use.

This article is an overview of the best providers, so when you think you've found a service that works for you, we recommend that you consult its review to find out more about it. Thank you for reading.

Go here to see the original:
Best Cloud Storage Reviews 2019 | 31 Best Services Compared

Read More..

Qubit – Wikipedia

In quantum computing, a qubit or quantum bit (sometimes qbit) is the basic unit of quantum information: the quantum version of the classical binary bit, physically realized with a two-state device. A qubit is a two-state (or two-level) quantum-mechanical system, one of the simplest quantum systems displaying the peculiarity of quantum mechanics. Examples include the spin of the electron, in which the two levels can be taken as spin up and spin down, or the polarization of a single photon, in which the two states can be taken to be the vertical polarization and the horizontal polarization. In a classical system, a bit would have to be in one state or the other. However, quantum mechanics allows the qubit to be in a coherent superposition of both states/levels simultaneously, a property which is fundamental to quantum mechanics and quantum computing.

The coining of the term qubit is attributed to Benjamin Schumacher.[1] In the acknowledgments of his 1995 paper, Schumacher states that the term qubit was created in jest during a conversation with William Wootters. The paper describes a way of compressing states emitted by a quantum source of information so that they require fewer physical resources to store. This procedure is now known as Schumacher compression.

A binary digit, characterized as 0 and 1, is used to represent information in classical computers. A binary digit can represent up to one bit of Shannon information, where a bit is the basic unit of information. However, in this article, the word bit is synonymous with binary digit.

In classical computer technologies, a processed bit is implemented by one of two levels of low DC voltage, and whilst switching from one of these two levels to the other, a so-called forbidden zone must be passed as fast as possible, as electrical voltage cannot change from one level to another instantaneously.

There are two possible outcomes for the measurement of a qubit, usually taken to have the value "0" and "1", like a bit or binary digit. However, whereas the state of a bit can only be either 0 or 1, the general state of a qubit according to quantum mechanics can be a coherent superposition of both.[2] Moreover, whereas a measurement of a classical bit would not disturb its state, a measurement of a qubit would destroy its coherence and irrevocably disturb the superposition state. It is possible to fully encode one bit in one qubit. However, a qubit can hold more information, e.g. up to two bits using superdense coding.

For a system of $n$ components, a complete description of its state in classical physics requires only $n$ bits, whereas in quantum physics it requires $2^n - 1$ complex numbers.[3]

In quantum mechanics, the general quantum state of a qubit can be represented by a linear superposition of its two orthonormal basis states (or basis vectors). These vectors are usually denoted as $|0\rangle = \begin{bmatrix}1\\0\end{bmatrix}$ and $|1\rangle = \begin{bmatrix}0\\1\end{bmatrix}$. They are written in the conventional Dirac, or "bra-ket", notation; $|0\rangle$ and $|1\rangle$ are pronounced "ket 0" and "ket 1", respectively. These two orthonormal basis states, $\{|0\rangle, |1\rangle\}$, together called the computational basis, are said to span the two-dimensional linear vector (Hilbert) space of the qubit.

Qubit basis states can also be combined to form product basis states. For example, two qubits could be represented in a four-dimensional linear vector space spanned by the following product basis states: $|00\rangle = \begin{bmatrix}1\\0\\0\\0\end{bmatrix}$, $|01\rangle = \begin{bmatrix}0\\1\\0\\0\end{bmatrix}$, $|10\rangle = \begin{bmatrix}0\\0\\1\\0\end{bmatrix}$, and $|11\rangle = \begin{bmatrix}0\\0\\0\\1\end{bmatrix}$.
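
As a quick sanity check, these product basis states can be reproduced with NumPy, since multi-qubit basis vectors are Kronecker (tensor) products of single-qubit ones; a minimal sketch:

```python
# Product basis states as Kronecker products of single-qubit basis vectors.
import numpy as np

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])

ket01 = np.kron(ket0, ket1)  # |01> = |0> (tensor) |1>
print(ket01)                 # [0 1 0 0]
```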

In general, $n$ qubits are represented by a superposition state vector in a $2^n$-dimensional Hilbert space.

A pure qubit state is a coherent superposition of the basis states. This means that a single qubit can be described by a linear combination of $|0\rangle$ and $|1\rangle$:

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle,$$

where $\alpha$ and $\beta$ are probability amplitudes and can in general both be complex numbers. When we measure this qubit in the standard basis, according to the Born rule, the probability of outcome $|0\rangle$ with value "0" is $|\alpha|^2$ and the probability of outcome $|1\rangle$ with value "1" is $|\beta|^2$. Because the absolute squares of the amplitudes equate to probabilities, it follows that $\alpha$ and $\beta$ must be constrained by the equation

$$|\alpha|^2 + |\beta|^2 = 1.$$

Note that a qubit in this superposition state does not have a value in between "0" and "1"; rather, when measured, the qubit has a probability $|\alpha|^2$ of the value "0" and a probability $|\beta|^2$ of the value "1". In other words, superposition means that there is no way, even in principle, to tell which of the two possible states forming the superposition state actually pertains. Furthermore, the probability amplitudes $\alpha$ and $\beta$ encode more than just the probabilities of the outcomes of a measurement; the relative phase of $\alpha$ and $\beta$ is responsible for quantum interference, e.g., as seen in the two-slit experiment.
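
A minimal NumPy sketch of the Born rule for a single qubit, checking the normalization constraint along the way:

```python
# Measurement outcome probabilities are the squared magnitudes of the
# amplitudes (the Born rule).
import numpy as np

alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # a valid pure state
state = np.array([alpha, beta])

p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
assert np.isclose(p0 + p1, 1.0)                 # normalization constraint
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")      # 0.50 each
```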

It might, at first sight, seem that there should be four degrees of freedom in $|\psi\rangle = \alpha|0\rangle + \beta|1\rangle$, as $\alpha$ and $\beta$ are complex numbers with two degrees of freedom each. However, one degree of freedom is removed by the normalization constraint $|\alpha|^2 + |\beta|^2 = 1$. This means, with a suitable change of coordinates, one can eliminate one of the degrees of freedom. One possible choice is that of Hopf coordinates:

$$|\psi\rangle = e^{i\delta}\left(\cos{\tfrac{\theta}{2}}|0\rangle + e^{i\phi}\sin{\tfrac{\theta}{2}}|1\rangle\right).$$

Additionally, for a single qubit the overall phase of the state $e^{i\delta}$ has no physically observable consequences, so we can arbitrarily choose $\alpha$ to be real (or $\beta$ in the case that $\alpha$ is zero), leaving just two degrees of freedom:

$$|\psi\rangle = \cos{\tfrac{\theta}{2}}|0\rangle + e^{i\phi}\sin{\tfrac{\theta}{2}}|1\rangle,$$

where $e^{i\phi}$ is the physically significant relative phase.

The possible quantum states for a single qubit can be visualised using a Bloch sphere. Represented on such a 2-sphere, a classical bit could only be at the "North Pole" or the "South Pole", in the locations where $|0\rangle$ and $|1\rangle$ are, respectively. This particular choice of the polar axis is arbitrary, however. The rest of the surface of the Bloch sphere is inaccessible to a classical bit, but a pure qubit state can be represented by any point on the surface. For example, the pure qubit state $(|0\rangle + i|1\rangle)/\sqrt{2}$ would lie on the equator of the sphere at the positive $y$-axis. In the classical limit, a qubit, which can have quantum states anywhere on the Bloch sphere, reduces to the classical bit, which can be found only at either pole.

The surface of the Bloch sphere is a two-dimensional space, which represents the state space of the pure qubit states. This state space has two local degrees of freedom, which can be represented by the two angles $\phi$ and $\theta$.
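
A small sketch, assuming the Hopf parametrization above, that recovers the Bloch-sphere angles from the amplitudes of a normalized pure state:

```python
# Recover the Bloch angles (theta, phi) from a normalized pure state
# written as cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>.
import numpy as np

def bloch_angles(alpha, beta):
    theta = 2 * np.arccos(np.clip(abs(alpha), 0.0, 1.0))
    phi = np.angle(beta) - np.angle(alpha)  # the relative phase
    return theta, phi

# The state (|0> + i|1>)/sqrt(2) sits on the equator at the positive y axis.
theta, phi = bloch_angles(1 / np.sqrt(2), 1j / np.sqrt(2))
print(np.degrees(theta), np.degrees(phi))  # 90.0 90.0
```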

A pure state is one fully specified by a single ket, $|\psi\rangle = \alpha|0\rangle + \beta|1\rangle$, a coherent superposition as described above. Coherence is essential for a qubit to be in a superposition state. With interactions and decoherence, it is possible to put the qubit in a mixed state, a statistical combination or incoherent mixture of different pure states. Mixed states can be represented by points inside the Bloch sphere (or in the Bloch ball). A mixed qubit state has three degrees of freedom: the angles $\phi$ and $\theta$, as well as the length $r$ of the vector that represents the mixed state.

There are various kinds of physical operations that can be performed on pure qubit states.

An important distinguishing feature between qubits and classical bits is that multiple qubits can exhibit quantum entanglement. Quantum entanglement is a nonlocal property of two or more qubits that allows a set of qubits to express higher correlation than is possible in classical systems.

The simplest system to display quantum entanglement is the system of two qubits. Consider, for example, two entangled qubits in the |Φ⁺⟩ Bell state:

|Φ⁺⟩ = (1/√2)(|00⟩ + |11⟩).
In this state, called an equal superposition, there are equal probabilities of measuring either product state |00⟩ or |11⟩, as |1/√2|² = 1/2. In other words, there is no way to tell if the first qubit has value 0 or 1, and likewise for the second qubit.

Imagine that these two entangled qubits are separated, with one each given to Alice and Bob. Alice makes a measurement of her qubit, obtaining, with equal probabilities, either |0⟩ or |1⟩; i.e., she cannot tell if her qubit has value 0 or 1. Because of the qubits' entanglement, Bob must now get exactly the same measurement result as Alice. For example, if she measures |0⟩, Bob must measure the same, as |00⟩ is the only state where Alice's qubit is |0⟩. In short, for these two entangled qubits, whatever Alice measures, Bob would too, with perfect correlation, in any basis, however far apart they may be, and even though neither can tell if their qubit has value 0 or 1: a most surprising circumstance that cannot be explained by classical physics.
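
This perfect correlation is easy to reproduce in a toy simulation: sample joint outcomes of the Bell state according to the Born rule and check that Alice's and Bob's bits always agree (the random seed and sample count below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# |Phi+> = (|00> + |11>)/sqrt(2) as a 4-component vector over the
# basis order |00>, |01>, |10>, |11>.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

probs = np.abs(phi_plus) ** 2            # Born rule over the joint basis
outcomes = rng.choice(4, size=10, p=probs)

for k in outcomes:
    alice, bob = divmod(k, 2)            # first and second qubit values
    assert alice == bob                  # perfectly correlated, every run
    print(alice, bob)
```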

Controlled gates act on 2 or more qubits, where one or more qubits act as a control for some specified operation. In particular, the controlled NOT gate (or CNOT or cX) acts on 2 qubits and performs the NOT operation on the second qubit only when the first qubit is |1⟩, otherwise leaving it unchanged. With respect to the unentangled product basis {|00⟩, |01⟩, |10⟩, |11⟩}, it maps the basis states as follows:

|00⟩ → |00⟩
|01⟩ → |01⟩
|10⟩ → |11⟩
|11⟩ → |10⟩
A common application of the CNOT gate is to maximally entangle two qubits into the |Φ⁺⟩ Bell state. To construct |Φ⁺⟩, the inputs A (control) and B (target) to the CNOT gate are:

(1/√2)(|0⟩ + |1⟩)_A and |0⟩_B

After applying CNOT, the output is the |Φ⁺⟩ Bell state: (1/√2)(|00⟩ + |11⟩).
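
A small matrix-level sketch of this construction (Hadamard on the control qubit to prepare the superposition, then CNOT), with the gate matrices written out explicitly:

```python
import numpy as np

# Hadamard on qubit A prepares (|0> + |1>)/sqrt(2); CNOT then entangles A and B.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])          # flips B when A (the control) is |1>

zero = np.array([1, 0])
state = np.kron(H @ zero, zero)          # inputs: (|0>+|1>)/sqrt(2) on A, |0> on B
bell = CNOT @ state

print(bell)   # [0.7071 0 0 0.7071] = (|00> + |11>)/sqrt(2)
```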

The |Φ⁺⟩ Bell state forms part of the setup of the superdense coding, quantum teleportation, and entangled quantum cryptography algorithms.

Quantum entanglement also allows multiple states (such as the Bell state mentioned above) to be acted on simultaneously, unlike classical bits that can only have one value at a time. Entanglement is a necessary ingredient of any quantum computation that cannot be done efficiently on a classical computer. Many of the successes of quantum computation and communication, such as quantum teleportation and superdense coding, make use of entanglement, suggesting that entanglement is a resource that is unique to quantum computation.[4] A major hurdle facing quantum computing, as of 2018, in its quest to surpass classical digital computing, is noise in quantum gates that limits the size of quantum circuits that can be executed reliably.[5]

A number of qubits taken together is a qubit register. Quantum computers perform calculations by manipulating qubits within a register. A qubyte (quantum byte) is a collection of eight qubits.[6]

Similar to the qubit, the qutrit is the unit of quantum information that can be realized in suitable 3-level quantum systems. This is analogous to the trit, the unit of classical information in ternary computers. Note, however, that not all 3-level quantum systems are qutrits.[7] More generally, the term qudit (quantum digit) denotes the unit of quantum information that can be realized in suitable d-level quantum systems.[8]

Any two-level quantum-mechanical system can be used as a qubit. Multilevel systems can be used as well, if they possess two states that can be effectively decoupled from the rest (e.g., the ground state and first excited state of a nonlinear oscillator). There are various proposals, and several physical implementations that approximate two-level systems have been successfully realized to various degrees. Just as a classical bit can be represented by the state of a transistor in a processor, the magnetization of a surface in a hard disk, or the presence of current in a cable, all within the same computer, an eventual quantum computer is likely to use various combinations of qubits in its design.

The following is an incomplete list of physical implementations of qubits, and the choices of basis are by convention only.

In a paper entitled "Solid-state quantum memory using the 31P nuclear spin", published in the October 23, 2008, issue of the journal Nature,[9] a team of scientists from the U.K. and U.S. reported the first relatively long (1.75 seconds) and coherent transfer of a superposition state in an electron spin "processing" qubit to a nuclear spin "memory" qubit. This event can be considered the first relatively consistent quantum data storage, a vital step towards the development of quantum computing. Recently, a modification of similar systems (using charged rather than neutral donors) has dramatically extended this time, to 3 hours at very low temperatures and 39 minutes at room temperature.[10] Room temperature preparation of a qubit based on electron spins instead of nuclear spin was also demonstrated by a team of scientists from Switzerland and Australia.[11]

Here is the original post:
Qubit - Wikipedia

Read More..

When Will Quantum Computing Have Real Commercial Value …

Photo: IBM Research. Workers assemble the enclosure for the IBM Q System One quantum computer, which was shown at the Consumer Electronics Show in Las Vegas in January.

Our romance with new technologies always seems to follow the same trajectory: We are by turns mesmerized and adoring, disappointed and disheartened, and end up settling for less than we originally imagined. In 1954, Texas Instruments touted its new transistors as bringing electronic brains approaching the human brain in scope and reliability much closer to reality. In 2000, U.S. president Bill Clinton declared that the Human Genome Project would lead to a world in which our children's children will know the term cancer only as a constellation of stars. And so it is now with quantum computing.

The popular press is awash with articles touting its promise. Tech giants are pouring huge amounts of money into building prototypes. You get the distinct impression that the computer industry is on the verge of an imminent quantum revolution.

But not everyone believes that quantum computing is going to solve real-world problems in anything like the time frame that some proponents of the technology want us to believe. Indeed, many of the researchers involved acknowledge the hype has gotten out of control, cautioning that quantum computing may take decades to mature.

Theoretical physicist Mikhail Dyakonov, a researcher for many years at the Ioffe Institute, in Saint Petersburg, Russia, and now at the University of Montpellier, in France, is even more skeptical. In "The Case Against Quantum Computing," he lays out his view that practical general-purpose quantum computers will not be built anytime in the foreseeable future.

As you might expect, his essay ruffled some feathers after it was published online. But as it turns out, while his article was being prepared, a committee assembled by the U.S. National Academies of Sciences, Engineering, and Medicine had been grappling with the very same question.

The committee was to provide "an independent assessment of the feasibility and implications of creating a functional quantum computer capable of addressing real-world problems...." It was to estimate the time and resources required, and how to assess the probability of success.

The experts who took up the challenge included John Martinis of the University of California, Santa Barbara, who heads Google's quantum-hardware efforts; David Awschalom of the University of Chicago, who formerly directed the Center for Spintronics and Quantum Computation at UCSB; and Umesh Vazirani of the University of California, Berkeley, who codirects the Berkeley Quantum Information and Computation Center.

To their credit, in their report, released in December, they didn't sugarcoat the difficulties. Quite the opposite.

The committee concluded that it is "highly unexpected" that anyone will be able to build a quantum computer that could compromise public-key cryptosystems (a task that quantum computers are, in theory, especially suitable for tackling) in the coming decade. And while less-capable noisy intermediate-scale quantum computers will be built within that time frame, "there are at present no known algorithms/applications that could make effective use of this class of machine," the committee says.

Okay, if not a decade, then how long? The committee was not prepared to commit itself to any estimate. Authors of a commentary in the January issue of the Proceedings of the IEEE devoted to quantum computing were similarly reticent to make concrete predictions. So the answer is: Nobody really knows.

The people working in this area are nevertheless thrilled by recent progress they've made on proof-of-concept devices and by the promise of this research. They no doubt consider the technical hurdles to be much more tractable than Dyakonov concludes. So don't be surprised when you see their perspectives appear in Spectrum, too.

This article appears in the March 2019 print issue as "Quantum Computing's Prospects."

Read the rest here:
When Will Quantum Computing Have Real Commercial Value ...

Read More..

The Case Against Quantum Computing – IEEE Spectrum

Illustration: Christian Gralingen

Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades, and without any practical results to show for it.

We've been told that quantum computers could provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex systems, and artificial intelligence. We've been assured that quantum computers will forever alter our economic, industrial, academic, and societal landscape. We've even been told that the encryption that protects the world's most sensitive data may soon be broken by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.

Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are keen to figure out how this technology can help them.

It's become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world's top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.

In light of all this, it's natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, "Not in the foreseeable future." Having spent decades conducting research in quantum and condensed-matter physics, I've developed my very pessimistic view. It's based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.

The idea of quantum computing first appeared nearly 40 years ago, in 1980, when the Russian-born mathematician Yuri Manin, who now works at the Max Planck Institute for Mathematics, in Bonn, first put forward the notion, albeit in a rather vague form. The concept really got on the map, though, the following year, when physicist Richard Feynman, at the California Institute of Technology, independently proposed it.

Realizing that computer simulations of quantum systems become impossible to carry out when the system under scrutiny gets too complicated, Feynman advanced the idea that the computer itself should operate in the quantum mode: "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy," he opined. A few years later, University of Oxford physicist David Deutsch formally described a general-purpose quantum computer, a quantum analogue of the universal Turing machine.

The subject did not attract much attention, though, until 1994, when mathematician Peter Shor (then at Bell Laboratories and now at MIT) proposed an algorithm for an ideal quantum computer that would allow very large numbers to be factored much faster than could be done on a conventional computer. This outstanding theoretical result triggered an explosion of interest in quantum computing. Many thousands of research papers, mostly theoretical, have since been published on the subject, and they continue to come out at an increasing rate.

The basic idea of quantum computing is to store and process information in a way that is very different from what is done in conventional computers, which are based on classical physics. Boiling down the many details, it's fair to say that conventional computers operate by manipulating a large number of tiny transistors working essentially as on-off switches, which change state between cycles of the computer's clock.

The state of the classical computer at the start of any given clock cycle can therefore be described by a long sequence of bits corresponding physically to the states of individual transistors. With N transistors, there are 2^N possible states for the computer to be in. Computation on such a machine fundamentally consists of switching some of its transistors between their on and off states, according to a prescribed program.

In quantum computing, the classical two-state circuit element (the transistor) is replaced by a quantum element called a quantum bit, or qubit. Like the conventional bit, it also has two basic states. Although a variety of physical objects could reasonably serve as quantum bits, the simplest thing to use is the electron's internal angular momentum, or spin, which has the peculiar quantum property of having only two possible projections on any coordinate axis: +1/2 or −1/2 (in units of the Planck constant). Whatever the chosen axis, you can denote the two basic quantum states of the electron's spin as ↑ and ↓.

Here's where things get weird. With the quantum bit, those two states aren't the only ones possible. That's because the spin state of an electron is described by a quantum-mechanical wave function. And that function involves two complex numbers, α and β (called quantum amplitudes), which, being complex numbers, have real parts and imaginary parts. Those complex numbers, α and β, each have a certain magnitude, and according to the rules of quantum mechanics, their squared magnitudes must add up to 1.

That's because those two squared magnitudes correspond to the probabilities for the spin of the electron to be in the basic states ↑ and ↓ when you measure it. And because those are the only outcomes possible, the two associated probabilities must add up to 1. For example, if the probability of finding the electron in the ↑ state is 0.6 (60 percent), then the probability of finding it in the ↓ state must be 0.4 (40 percent); nothing else would make sense.

In contrast to a classical bit, which can only be in one of its two basic states, a qubit can be in any of a continuum of possible states, as defined by the values of the quantum amplitudes α and β. This property is often described by the rather mystical and intimidating statement that a qubit can exist simultaneously in both of its ↑ and ↓ states.

Yes, quantum mechanics often defies intuition. But this concept shouldn't be couched in such perplexing language. Instead, think of a vector positioned in the x-y plane and canted at 45 degrees to the x-axis. Somebody might say that this vector simultaneously points in both the x- and y-directions. That statement is true in some sense, but it's not really a useful description. Describing a qubit as being simultaneously in both ↑ and ↓ states is, in my view, similarly unhelpful. And yet, it's become almost de rigueur for journalists to describe it as such.

In a system with two qubits, there are 2^2 or 4 basic states, which can be written (↑↑), (↑↓), (↓↑), and (↓↓). Naturally enough, the two qubits can be described by a quantum-mechanical wave function that involves four complex numbers. In the general case of N qubits, the state of the system is described by 2^N complex numbers, which are restricted by the condition that their squared magnitudes must all add up to 1.

While a conventional computer with N bits at any given moment must be in one of its 2^N possible states, the state of a quantum computer with N qubits is described by the values of the 2^N quantum amplitudes, which are continuous parameters (ones that can take on any value, not just a 0 or a 1). This is the origin of the supposed power of the quantum computer, but it is also the reason for its great fragility and vulnerability.
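
The contrast is easy to make concrete: a classical register of N bits is in one definite state, while a faithful classical description of N qubits requires storing 2^N complex amplitudes. A back-of-the-envelope sketch:

```python
# The full state of N qubits is a vector of 2**N complex amplitudes.
for n in (1, 2, 10, 30):
    print(n, "qubits ->", 2 ** n, "amplitudes")

# At 16 bytes per complex number, even 30 qubits already need ~17 GB
# of classical memory just to write the state down.
print(2 ** 30 * 16 / 1e9, "GB")
```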

How is information processed in such a machine? That's done by applying certain kinds of transformations, dubbed quantum gates, that change these parameters in a precise and controlled manner.

Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2^1,000, which is to say about 10^300. That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.

To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.
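
That estimate can be checked in a couple of lines; the result rounds to the ballpark figure the article cites:

```python
import math

# Express 2**1000 as a power of ten:
print(1000 * math.log10(2))   # ~301.03, so 2**1000 is roughly 10**301
print(len(str(2 ** 1000)))    # 302 decimal digits
```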

At this point in a description of a possible future technology, a hardheaded engineer loses interest. But let's continue. In any real-world computer, you have to consider the effects of errors. In a conventional computer, those arise when one or more transistors are switched off when they are supposed to be switched on, or vice versa. This unwanted occurrence can be dealt with using relatively simple error-correction methods, which make use of some level of redundancy built into the hardware.

In contrast, it's absolutely unimaginable how to keep errors under control for the 10^300 continuous parameters that must be processed by a useful quantum computer. Yet quantum-computing theorists have succeeded in convincing the general public that this is feasible. Indeed, they claim that something called the threshold theorem proves it can be done. They point out that once the error per qubit per quantum gate is below a certain value, indefinitely long quantum computation becomes possible, at a cost of substantially increasing the number of qubits needed. With those extra qubits, they argue, you can handle errors by forming logical qubits using multiple physical qubits.

How many physical qubits would be required for each logical qubit? No one really knows, but estimates typically range from about 1,000 to 100,000. So the upshot is that a useful quantum computer now needs a million or more qubits. And the number of continuous parameters defining the state of this hypothetical quantum-computing machine (which was already more than astronomical with 1,000 qubits) now becomes even more ludicrous.

Even without considering these impossibly large numbers, it's sobering that no one has yet figured out how to combine many physical qubits into a smaller number of logical qubits that can compute something useful. And it's not like this hasn't long been a key goal.

In the early 2000s, at the request of the Advanced Research and Development Activity (a funding agency of the U.S. intelligence community that is now part of Intelligence Advanced Research Projects Activity), a team of distinguished experts in quantum information established a road map for quantum computing. It had a goal for 2012 that "requires on the order of 50 physical qubits" and "exercises multiple logical qubits through the full range of operations required for fault-tolerant [quantum computation] in order to perform a simple instance of a relevant quantum algorithm." It's now the end of 2018, and that ability has still not been demonstrated.

The huge amount of scholarly literature that's been generated about quantum computing is notably light on experimental studies describing actual hardware. The relatively few experiments that have been reported were extremely difficult to conduct, though, and must command respect and admiration.

The goal of such proof-of-principle experiments is to show the possibility of carrying out basic quantum operations and to demonstrate some elements of the quantum algorithms that have been devised. The number of qubits used for them is below 10, usually from 3 to 5. Apparently, going from 5 qubits to 50 (the goal set by the ARDA Experts Panel for the year 2012) presents experimental difficulties that are hard to overcome. Most probably they are related to the simple fact that 2^5 = 32, while 2^50 = 1,125,899,906,842,624.

By contrast, the theory of quantum computing does not appear to meet any substantial difficulties in dealing with millions of qubits. In studies of error rates, for example, various noise models are being considered. It has been proved (under certain assumptions) that errors generated by local noise can be corrected by carefully designed and very ingenious methods, involving, among other tricks, massive parallelism, with many thousands of gates applied simultaneously to different pairs of qubits and many thousands of measurements done simultaneously, too.

A decade and a half ago, ARDA's Experts Panel noted that "it has been established, under certain assumptions, that if a threshold precision per gate operation could be achieved, quantum error correction would allow a quantum computer to compute indefinitely." Here, the key words are "under certain assumptions." That panel of distinguished experts did not, however, address the question of whether these assumptions could ever be satisfied.

I argue that they can't. In the physical world, continuous quantities (be they voltages or the parameters defining quantum-mechanical wave functions) can be neither measured nor manipulated exactly. That is, no continuously variable quantity can be made to have an exact value, including zero. To a mathematician, this might sound absurd, but this is the unquestionable reality of the world we live in, as any engineer knows.

Sure, discrete quantities, like the number of students in a classroom or the number of transistors in the on state, can be known exactly. Not so for quantities that vary continuously. And this fact accounts for the great difference between a conventional digital computer and the hypothetical quantum computer.

Indeed, all of the assumptions that theorists make about the preparation of qubits into a given state, the operation of the quantum gates, the reliability of the measurements, and so forth, cannot be fulfilled exactly. They can only be approached with some limited precision. So, the real question is: What precision is required? With what exactitude must, say, the square root of 2 (an irrational number that enters into many of the relevant quantum operations) be experimentally realized? Should it be approximated as 1.41 or as 1.41421356237? Or is even more precision needed? There are no clear answers to these crucial questions.

While various strategies for building quantum computers are now being explored, an approach that many people consider the most promising, initially undertaken by the Canadian company D-Wave Systems and now being pursued by IBM, Google, Microsoft, and others, is based on using quantum systems of interconnected Josephson junctions cooled to very low temperatures (down to about 10 millikelvins).

The ultimate goal is to create a universal quantum computer, one that can beat conventional computers in factoring large numbers using Shor's algorithm, performing database searches by a similarly famous quantum-computing algorithm that Lov Grover developed at Bell Laboratories in 1996, and other specialized applications that are suitable for quantum computers.

On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not revealed the details of their work.

While I believe that such experimental research is beneficial and may lead to a better understanding of complicated quantum systems, I'm skeptical that these efforts will ever result in a practical quantum computer. Such a computer would have to be able to manipulate, on a microscopic level and with enormous precision, a physical system characterized by an unimaginably huge set of parameters, each of which can take on a continuous range of values. Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system?

My answer is simple. No, never.

I believe that, appearances to the contrary, the quantum computing fervor is nearing its end. That's because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises have been made, and anyone who has been following the topic starts to get annoyed by further announcements of impending breakthroughs. What's more, by that time all the tenured faculty positions in the field are already occupied. The proponents have grown older and less zealous, while the younger generation seeks something completely new and more likely to succeed.

All these problems, as well as a few others I've not mentioned here, raise serious doubts about the future of quantum computing. There is a tremendous gap between the rudimentary but very hard experiments that have been carried out with a few qubits and the extremely developed quantum-computing theory, which relies on manipulating thousands to millions of qubits to calculate anything useful. That gap is not likely to be closed anytime soon.

To my mind, quantum-computing researchers should still heed an admonition that IBM physicist Rolf Landauer made decades ago when the field heated up for the first time. He urged proponents of quantum computing to include in their publications a disclaimer along these lines: "This scheme, like all other schemes for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work."

Editor's note: A sentence in this article originally stated that concerns over required precision were never even discussed. This sentence was changed on 30 November 2018 after some readers pointed out to the author instances in the literature that had considered these issues. The amended sentence now reads: "There are no clear answers to these crucial questions."

Mikhail Dyakonov does research in theoretical physics at Charles Coulomb Laboratory at the University of Montpellier, in France. His name is attached to various physical phenomena, perhaps most famously Dyakonov surface waves.

Go here to read the rest:
The Case Against Quantum Computing - IEEE Spectrum

Read More..

Cloud – Wikipedia

In meteorology, a cloud is an aerosol consisting of a visible mass of minute liquid droplets, frozen crystals, or other particles suspended in the atmosphere of a planetary body.[1] Water or various other chemicals may compose the droplets and crystals. On Earth, clouds are formed as a result of saturation of the air when it is cooled to its dew point, or when it gains sufficient moisture (usually in the form of water vapor) from an adjacent source to raise the dew point to the ambient temperature. They are seen in the Earth's homosphere (which includes the troposphere, stratosphere, and mesosphere). Nephology is the science of clouds, which is undertaken in the cloud physics branch of meteorology.

There are two methods of naming clouds in their respective layers of the atmosphere: Latin and common. Cloud types in the troposphere, the atmospheric layer closest to Earth's surface, have Latin names due to the universal adoption of Luke Howard's nomenclature. Formally proposed in 1802, it became the basis of a modern international system that divides clouds into five physical forms that appear in any or all of three altitude levels (formerly known as étages). These physical types, in approximate ascending order of convective activity, include stratiform sheets, cirriform wisps and patches, stratocumuliform layers (mainly structured as rolls, ripples, and patches), cumuliform heaps, and very large cumulonimbiform heaps that often show complex structure. The physical forms are divided by altitude level into ten basic genus-types. The Latin names for applicable high-level genera carry a cirro- prefix, and an alto- prefix is added to the names of the mid-level genus-types. Most of the genera can be subdivided into species and further subdivided into varieties. A very low stratiform cloud that extends down to the Earth's surface is given the common name, fog, but has no Latin name.

Two cirriform clouds that form higher up in the stratosphere and mesosphere have common names for their main types. They are seen infrequently, mostly in the polar regions of Earth. Clouds have been observed in the atmospheres of other planets and moons in the Solar System and beyond. However, due to their different temperature characteristics, they are often composed of other substances such as methane, ammonia, and sulfuric acid as well as water.

Taken as a whole, homospheric clouds can be cross-classified by form and level to derive the ten tropospheric genera, the fog that forms at surface level, and the two additional major types above the troposphere. The cumulus genus includes three species that indicate vertical size. Clouds with sufficient vertical extent to occupy more than one altitude level are officially classified as low- or mid-level according to the altitude range at which each initially forms. However they are also more informally classified as multi-level or vertical.

The origin of the term cloud can be found in the Old English clud or clod, meaning a hill or a mass of rock. Around the beginning of the 13th century, the word came to be used as a metaphor for rain clouds, because of the similarity in appearance between a mass of rock and a cumulus heap cloud. Over time, the metaphoric usage of the word supplanted the Old English weolcan, which had been the literal term for clouds in general.[2][3]

Ancient cloud studies were not made in isolation, but were observed in combination with other weather elements and even other natural sciences. In about 340 BC the Greek philosopher Aristotle wrote Meteorologica, a work which represented the sum of knowledge of the time about natural science, including weather and climate. For the first time, precipitation and the clouds from which precipitation fell were called meteors, which originate from the Greek word meteoros, meaning 'high in the sky'. From that word came the modern term meteorology, the study of clouds and weather. Meteorologica was based on intuition and simple observation, but not on what is now considered the scientific method. Nevertheless, it was the first known work that attempted to treat a broad range of meteorological topics.[4]

After centuries of speculative theories about the formation and behavior of clouds, the first truly scientific studies were undertaken by Luke Howard in England and Jean-Baptiste Lamarck in France. Howard was a methodical observer with a strong grounding in the Latin language and used his background to classify the various tropospheric cloud types during 1802. He believed that the changing cloud forms in the sky could unlock the key to weather forecasting. Lamarck had worked independently on cloud classification the same year and had come up with a different naming scheme that failed to make an impression even in his home country of France because it used unusual French names for cloud types. His system of nomenclature included twelve categories of clouds, with such names as (translated from French) hazy clouds, dappled clouds, and broom-like clouds. By contrast, Howard used universally accepted Latin, which caught on quickly after it was published in 1803.[5] As a sign of the popularity of the naming scheme, the German dramatist and poet Johann Wolfgang von Goethe composed four poems about clouds, dedicating them to Howard. An elaboration of Howard's system was eventually formally adopted by the International Meteorological Conference in 1891.[5] This system covered only the tropospheric cloud types, but the discovery of clouds above the troposphere during the late 19th century eventually led to the creation of separate classification schemes for these very high clouds.[6]

Terrestrial clouds can be found throughout most of the homosphere, which includes the troposphere, stratosphere, and mesosphere. Within these layers of the atmosphere, air can become saturated as a result of being cooled to its dew point or by having moisture added from an adjacent source.[7] In the latter case, saturation occurs when the dew point is raised to the ambient air temperature.

Adiabatic cooling occurs when one or more of three possible lifting agents (cyclonic/frontal, convective, or orographic) causes a parcel of air containing invisible water vapor to rise and cool to its dew point, the temperature at which the air becomes saturated. The main mechanism behind this process is adiabatic cooling.[8] As the air is cooled to its dew point and becomes saturated, water vapor normally condenses to form cloud drops. This condensation normally occurs on cloud condensation nuclei such as salt or dust particles that are small enough to be held aloft by normal circulation of the air.[9][10]

Frontal and cyclonic lift occur when stable air is forced aloft at weather fronts and around centers of low pressure by a process called convergence.[11] Warm fronts associated with extratropical cyclones tend to generate mostly cirriform and stratiform clouds over a wide area unless the approaching warm airmass is unstable, in which case cumulus congestus or cumulonimbus clouds will usually be embedded in the main precipitating cloud layer.[12] Cold fronts are usually faster moving and generate a narrower line of clouds which are mostly stratocumuliform, cumuliform, or cumulonimbiform depending on the stability of the warm air mass just ahead of the front.[13]

Another agent is the convective upward motion of air caused by daytime solar heating at surface level.[9] Airmass instability allows for the formation of cumuliform clouds that can produce showers if the air is sufficiently moist.[14] On moderately rare occasions, convective lift can be powerful enough to penetrate the tropopause and push the cloud top into the stratosphere.[15]

A third source of lift is wind circulation forcing air over a physical barrier such as a mountain (orographic lift).[9] If the air is generally stable, nothing more than lenticular cap clouds will form. However, if the air becomes sufficiently moist and unstable, orographic showers or thunderstorms may appear.[16]

Along with adiabatic cooling that requires a lifting agent, there are three major non-adiabatic mechanisms for lowering the temperature of the air to its dew point. Conductive, radiational, and evaporative cooling require no lifting mechanism and can cause condensation at surface level resulting in the formation of fog.[17][18][19]

There are several main sources of water vapor that can be added to the air as a way of achieving saturation without any cooling process: water or moist ground,[20][21][22] precipitation or virga,[23] and transpiration from plants.[24]

Tropospheric classification is based on a hierarchy of categories with physical forms and altitude levels at the top.[25][26] These are cross-classified into a total of ten genus types, most of which can be divided into species and further subdivided into varieties which are at the bottom of the hierarchy.[27]

Clouds in the troposphere assume five physical forms based on structure and process of formation. These forms are commonly used for the purpose of satellite analysis.[25] They are given below in approximate ascending order of instability or convective activity.[28]

Non-convective stratiform clouds appear in stable airmass conditions and, in general, have flat sheet-like structures that can form at any altitude in the troposphere.[29] The stratiform group is divided by altitude range into the genera cirrostratus (high-level), altostratus (mid-level), stratus (low-level), and nimbostratus (multi-level).[26] Fog is commonly considered a surface-based cloud layer.[16] The fog may form at surface level in clear air or it may be the result of a very low stratus cloud subsiding to ground or sea level. Conversely, low stratiform cloud results when advection fog is lifted above surface level during breezy conditions.

Cirriform clouds in the troposphere are of the genus cirrus and have the appearance of detached or semi-merged filaments. They form at high tropospheric altitudes in air that is mostly stable with little or no convective activity, although denser patches may occasionally show buildups caused by limited high-level convection where the air is partly unstable.[30] Clouds resembling cirrus can be found above the troposphere but are classified separately using common names.

Clouds of this structure have both cumuliform and stratiform characteristics in the form of rolls, ripples, or elements.[31] They generally form as a result of limited convection in an otherwise mostly stable airmass topped by an inversion layer.[32] If the inversion layer is absent or higher in the troposphere, increased air mass instability may cause the cloud layers to develop tops in the form of turrets consisting of embedded cumuliform buildups.[33] The stratocumuliform group is divided into cirrocumulus (high-level), altocumulus (mid-level), and stratocumulus (low-level).[31]

Cumuliform clouds generally appear in isolated heaps or tufts.[34][35] They are the product of localized but generally free-convective lift where there are no inversion layers in the troposphere to limit vertical growth. In general, small cumuliform clouds tend to indicate comparatively weak instability. Larger cumuliform types are a sign of greater atmospheric instability and convective activity.[36] Depending on their vertical size, clouds of the cumulus genus type may be low-level or multi-level with moderate to towering vertical extent.[26]

The largest free-convective clouds comprise the genus cumulonimbus which have towering vertical extent. They occur in highly unstable air[9] and often have fuzzy outlines at the upper parts of the clouds that sometimes include anvil tops.[31] These clouds are the product of very strong convection that can penetrate the lower stratosphere.

Tropospheric clouds form in any of three levels (formerly called étages) based on altitude range above the Earth's surface. The grouping of clouds into levels is commonly done for the purposes of cloud atlases, surface weather observations[26] and weather maps.[37] The base-height range for each level varies depending on the latitudinal geographical zone.[26] Each altitude level comprises two or three genus types differentiated mainly by physical form.[38][31]

The standard levels and genus-types are summarised below in approximate descending order of the altitude at which each is normally based.[39] Multi-level clouds with significant vertical extent are separately listed and summarized in approximate ascending order of instability or convective activity.[28]

High clouds form at altitudes of 3,000 to 7,600 m (10,000 to 25,000 ft) in the polar regions, 5,000 to 12,200 m (16,500 to 40,000 ft) in the temperate regions and 6,100 to 18,300 m (20,000 to 60,000 ft) in the tropics.[26] All cirriform clouds are classified as high and thus constitute a single genus cirrus (Ci). Stratocumuliform and stratiform clouds in the high altitude range carry the prefix cirro-, yielding the respective genus names cirrocumulus (Cc) and cirrostratus (Cs). When limited-resolution satellite images of high clouds are analysed without supporting data from direct human observations, it becomes impossible to distinguish between individual forms or genus types, which are then collectively identified as high-type (or informally as cirrus-type even though not all high clouds are of the cirrus form or genus).[40]

Non-vertical clouds in the middle level are prefixed by alto-, yielding the genus names altocumulus (Ac) for stratocumuliform types and altostratus (As) for stratiform types. These clouds can form as low as 2,000 m (6,500 ft) above surface at any latitude, but may be based as high as 4,000 m (13,000 ft) near the poles, 7,000 m (23,000 ft) at mid latitudes, and 7,600 m (25,000 ft) in the tropics.[26] As with high clouds, the main genus types are easily identified by the human eye, but it is not possible to distinguish between them using satellite photography. Without the support of human observations, these clouds are usually collectively identified as middle-type on satellite images.[40]

Low clouds are found from near surface up to 2,000 m (6,500 ft).[26] Genus types in this level either have no prefix or carry one that refers to a characteristic other than altitude. Clouds that form in the low level of the troposphere are generally of larger structure than those that form in the middle and high levels, so they can usually be identified by their forms and genus types using satellite photography alone.[40]

These clouds have low- to middle-level bases that form anywhere from near surface to about 2,400 m (8,000 ft) and tops that can extend into the high altitude range. Nimbostratus and some cumulus in this group usually achieve moderate or deep vertical extent, but without towering structure. However, with sufficient airmass instability, upward-growing cumuliform clouds can grow to high towering proportions. Although genus types with vertical extent are often informally considered a single group,[58] the International Civil Aviation Organization (ICAO) distinguishes towering vertical clouds more formally as a separate group or sub-group. It is specified that these very large cumuliform and cumulonimbiform types must be identified by their standard names or abbreviations in all aviation observations (METARS) and forecasts (TAFS) to warn pilots of possible severe weather and turbulence.[59] Multi-level clouds are of even larger structure than low clouds, and are therefore identifiable by their forms and genera (and even species in the case of cumulus congestus) using satellite photography.[40]

This is a diffuse dark-grey non-convective stratiform layer with great horizontal extent and moderate to deep vertical development. It lacks towering structure and looks feebly illuminated from the inside.[60] Nimbostratus normally forms from mid-level altostratus, and develops at least moderate vertical extent[58][61] when the base subsides into the low level during precipitation that can reach moderate to heavy intensity. It commonly achieves deep vertical development when it simultaneously grows upward into the high level due to large scale frontal or cyclonic lift.[62] The nimbo- prefix refers to its ability to produce continuous rain or snow over a wide area, especially ahead of a warm front.[63] This thick cloud layer may be accompanied by embedded towering cumuliform or cumulonimbiform types.[61][64] Meteorologists affiliated with the World Meteorological Organization (WMO) officially classify nimbostratus as mid-level for synoptic purposes while informally characterizing it as multi-level.[26] Independent meteorologists and educators appear split between those who largely follow the WMO model[58][61] and those who classify nimbostratus as low-level, despite its considerable vertical extent and its usual initial formation in the middle altitude range.[65][66]

These clouds are sometimes classified separately from the other vertical or multi-level types because of their ability to produce severe turbulence.[59]

Genus types are commonly divided into subtypes called species that indicate specific structural details which can vary according to the stability and windshear characteristics of the atmosphere at any given time and location. Despite this hierarchy, a particular species may be a subtype of more than one genus, especially if the genera are of the same physical form and are differentiated from each other mainly by altitude or level. There are a few species, each of which can be associated with genera of more than one physical form.[72] The species types are grouped below according to the physical forms and genera with which each is normally associated. The forms, genera, and species are listed in approximate ascending order of instability or convective activity.[28]

Genus and species types are further subdivided into varieties whose names can appear after the species name to provide a fuller description of a cloud. Some cloud varieties are not restricted to a specific altitude level or form, and can therefore be common to more than one genus or species.[73]

Of the stratiform group, high-level cirrostratus comprises two species. Cirrostratus nebulosus has a rather diffuse appearance lacking in structural detail.[74] Cirrostratus fibratus is a species made of semi-merged filaments that are transitional to or from cirrus.[75] Mid-level altostratus and multi-level nimbostratus always have a flat or diffuse appearance and are therefore not subdivided into species. Low stratus is of the species nebulosus[74] except when broken up into ragged sheets of stratus fractus (see below).[58][72][76]

Cirriform clouds have three non-convective species that can form in mostly stable airmass conditions. Cirrus fibratus comprise filaments that may be straight, wavy, or occasionally twisted by non-convective wind shear.[75] The species uncinus is similar but has upturned hooks at the ends. Cirrus spissatus appear as opaque patches that can show light grey shading.[72]

Stratocumuliform genus-types (cirrocumulus, altocumulus, and stratocumulus) that appear in mostly stable air have two species each. The stratiformis species normally occur in extensive sheets or in smaller patches where there is only minimal convective activity.[77] Clouds of the lenticularis species tend to have lens-like shapes tapered at the ends. They are most commonly seen as orographic mountain-wave clouds, but can occur anywhere in the troposphere where there is strong wind shear combined with sufficient airmass stability to maintain a generally flat cloud structure. These two species can be found in the high, middle, or low level of the troposphere depending on the stratocumuliform genus or genera present at any given time.[58][72][76]

The species fractus shows variable instability because it can be a subdivision of genus-types of different physical forms that have different stability characteristics. This subtype can be in the form of ragged but mostly stable stratiform sheets (stratus fractus) or small ragged cumuliform heaps with somewhat greater instability (cumulus fractus).[72][76][78] When clouds of this species are associated with precipitating cloud systems of considerable vertical and sometimes horizontal extent, they are also classified as accessory clouds under the name pannus (see section on supplementary features).[79]

These species are subdivisions of genus types that can occur in partly unstable air. The species castellanus appears when a mostly stable stratocumuliform or cirriform layer becomes disturbed by localized areas of airmass instability, usually in the morning or afternoon. This results in the formation of cumuliform buildups arising from a common stratiform base.[80] Castellanus resembles the turrets of a castle when viewed from the side, and can be found with stratocumuliform genera at any tropospheric altitude level and with limited-convective patches of high-level cirrus.[81] Tufted clouds of the more detached floccus species are subdivisions of genus-types which may be cirriform or stratocumuliform in overall structure. They are sometimes seen with cirrus, cirrocumulus, altocumulus, and stratocumulus.[82]

A newly recognized species of stratocumulus or altocumulus has been given the name volutus, a roll cloud that can occur ahead of a cumulonimbus formation.[83] There are some volutus clouds that form as a consequence of interactions with specific geographical features rather than with a parent cloud. Perhaps the strangest geographically specific cloud of this type is the Morning Glory, a rolling cylindrical cloud that appears unpredictably over the Gulf of Carpentaria in Northern Australia. Associated with a powerful "ripple" in the atmosphere, the cloud may be "surfed" in glider aircraft.[84]

More general airmass instability in the troposphere tends to produce clouds of the more freely convective cumulus genus type, whose species are mainly indicators of degrees of atmospheric instability and resultant vertical development of the clouds. A cumulus cloud initially forms in the low level of the troposphere as a cloudlet of the species humilis that shows only slight vertical development. If the air becomes more unstable, the cloud tends to grow vertically into the species mediocris, then congestus, the tallest cumulus species[72] which is the same type that the International Civil Aviation Organization refers to as 'towering cumulus'.[59]

With highly unstable atmospheric conditions, large cumulus may continue to grow into cumulonimbus calvus (essentially a very tall congestus cloud that produces thunder), then ultimately into the species capillatus when supercooled water droplets at the top of the cloud turn into ice crystals giving it a cirriform appearance.[72][76]

All cloud varieties fall into one of two main groups. One group identifies the opacities of particular low and mid-level cloud structures and comprises the varieties translucidus (thin translucent), perlucidus (thick opaque with translucent or very small clear breaks), and opacus (thick opaque). These varieties are always identifiable for cloud genera and species with variable opacity. All three are associated with the stratiformis species of altocumulus and stratocumulus. However, only two varieties are seen with altostratus and stratus nebulosus whose uniform structures prevent the formation of a perlucidus variety. Opacity-based varieties are not applied to high clouds because they are always translucent, or in the case of cirrus spissatus, always opaque.[73][85]

A second group describes the occasional arrangements of cloud structures into particular patterns that are discernible by a surface-based observer (cloud fields usually being visible only from a significant altitude above the formations). These varieties are not always present with the genera and species with which they are otherwise associated, but only appear when atmospheric conditions favor their formation. Intortus and vertebratus varieties occur on occasion with cirrus fibratus. They are respectively filaments twisted into irregular shapes, and those that are arranged in fishbone patterns, usually by uneven wind currents that favor the formation of these varieties. The variety radiatus is associated with cloud rows of a particular type that appear to converge at the horizon. It is sometimes seen with the fibratus and uncinus species of cirrus, the stratiformis species of altocumulus and stratocumulus, the mediocris and sometimes humilis species of cumulus,[87][88] and with the genus altostratus.[89]

Another variety, duplicatus (closely spaced layers of the same type, one above the other), is sometimes found with cirrus of both the fibratus and uncinus species, and with altocumulus and stratocumulus of the species stratiformis and lenticularis. The variety undulatus (having a wavy undulating base) can occur with any clouds of the species stratiformis or lenticularis, and with altostratus. It is only rarely observed with stratus nebulosus. The variety lacunosus is caused by localized downdrafts that create circular holes in the form of a honeycomb or net. It is occasionally seen with cirrocumulus and altocumulus of the species stratiformis, castellanus, and floccus, and with stratocumulus of the species stratiformis and castellanus.[73][85]

It is possible for some species to show combined varieties at one time, especially if one variety is opacity-based and the other is pattern-based. An example of this would be a layer of altocumulus stratiformis arranged in seemingly converging rows separated by small breaks. The full technical name of a cloud in this configuration would be altocumulus stratiformis radiatus perlucidus, which would identify respectively its genus, species, and two combined varieties.[76][73][85]

Supplementary features and accessory clouds are not further subdivisions of cloud types below the species and variety level. Rather, they are either hydrometeors or special cloud types with their own Latin names that form in association with certain cloud genera, species, and varieties.[76][85] Supplementary features, whether in the form of clouds or precipitation, are directly attached to the main genus-cloud. Accessory clouds, by contrast, are generally detached from the main cloud.[90]

One group of supplementary features are not actual cloud formations, but precipitation that falls when water droplets or ice crystals that make up visible clouds have grown too heavy to remain aloft. Virga is a feature seen with clouds producing precipitation that evaporates before reaching the ground, these being of the genera cirrocumulus, altocumulus, altostratus, nimbostratus, stratocumulus, cumulus, and cumulonimbus.[90]

When the precipitation reaches the ground without completely evaporating, it is designated as the feature praecipitatio.[91] This normally occurs with altostratus opacus, which can produce widespread but usually light precipitation, and with thicker clouds that show significant vertical development. Of the latter, upward-growing cumulus mediocris produces only isolated light showers, while downward growing nimbostratus is capable of heavier, more extensive precipitation. Towering vertical clouds have the greatest ability to produce intense precipitation events, but these tend to be localized unless organized along fast-moving cold fronts. Showers of moderate to heavy intensity can fall from cumulus congestus clouds. Cumulonimbus, the largest of all cloud genera, has the capacity to produce very heavy showers. Low stratus clouds usually produce only light precipitation, but this always occurs as the feature praecipitatio due to the fact this cloud genus lies too close to the ground to allow for the formation of virga.[76][85][90]

Incus is the most type-specific supplementary feature, seen only with cumulonimbus of the species capillatus. A cumulonimbus incus cloud top is one that has spread out into a clear anvil shape as a result of rising air currents hitting the stability layer at the tropopause where the air no longer continues to get colder with increasing altitude.[92]

The mamma feature forms on the bases of clouds as downward-facing bubble-like protuberances caused by localized downdrafts within the cloud. It is also sometimes called mammatus, an earlier version of the term used before a standardization of Latin nomenclature brought about by the World Meteorological Organization during the 20th century. The best-known is cumulonimbus with mammatus, but the mamma feature is also seen occasionally with cirrus, cirrocumulus, altocumulus, altostratus, and stratocumulus.[90]

A tuba feature is a cloud column that may hang from the bottom of a cumulus or cumulonimbus. A newly formed or poorly organized column might be comparatively benign, but can quickly intensify into a funnel cloud or tornado.[90][93][94]

An arcus feature is a roll cloud with ragged edges attached to the lower front part of cumulus congestus or cumulonimbus that forms along the leading edge of a squall line or thunderstorm outflow.[95] A large arcus formation can have the appearance of a dark menacing arch.[90]

Several new supplementary features have been formally recognized by the World Meteorological Organization (WMO). The feature fluctus can form under conditions of strong atmospheric wind shear when a stratocumulus, altocumulus, or cirrus cloud breaks into regularly spaced crests. This variant is sometimes known informally as a Kelvin–Helmholtz (wave) cloud. This phenomenon has also been observed in cloud formations over other planets and even in the sun's atmosphere.[96] Another highly disturbed but more chaotic wave-like cloud feature associated with stratocumulus or altocumulus cloud has been given the Latin name asperitas. The supplementary feature cavum is a circular fall-streak hole that occasionally forms in a thin layer of supercooled altocumulus or cirrocumulus. Fall streaks consisting of virga or wisps of cirrus are usually seen beneath the hole as ice crystals fall out to a lower altitude. This type of hole is usually larger than typical lacunosus holes. A murus feature is a cumulonimbus wall cloud with a lowering, rotating cloud base that can lead to the development of tornadoes. A cauda feature is a tail cloud that extends horizontally away from the murus cloud and is the result of air feeding into the storm.[83]

Supplementary cloud formations detached from the main cloud are known as accessory clouds.[76][85][90] The heavier precipitating clouds (nimbostratus, towering cumulus congestus, and cumulonimbus) are typically accompanied during precipitation by the pannus feature, low ragged accessory clouds of the genera and species cumulus fractus or stratus fractus.[79]

A group of accessory clouds comprise formations that are associated mainly with upward-growing cumuliform and cumulonimbiform clouds of free convection. Pileus is a cap cloud that can form over a cumulonimbus or large cumulus cloud,[97] whereas a velum feature is a thin horizontal sheet that sometimes forms like an apron around the middle or in front of the parent cloud.[90] An accessory cloud recently officially recognized by the World Meteorological Organization is the flumen, also known more informally as the beaver's tail. It is formed by the warm, humid inflow of a supercell thunderstorm, and can be mistaken for a tornado. Although the flumen can indicate a tornado risk, it is similar in appearance to pannus or scud clouds and does not rotate.[83]

Clouds initially form in clear air or become clouds when fog rises above surface level. The genus of a newly formed cloud is determined mainly by air mass characteristics such as stability and moisture content. If these characteristics change over time, the genus tends to change accordingly. When this happens, the original genus is called a mother cloud. If the mother cloud retains much of its original form after the appearance of the new genus, it is termed a genitus cloud. One example of this is stratocumulus cumulogenitus, a stratocumulus cloud formed by the partial spreading of a cumulus type when there is a loss of convective lift. If the mother cloud undergoes a complete change in genus, it is considered to be a mutatus cloud.[98]

The genitus and mutatus categories have been expanded to include certain types that do not originate from pre-existing clouds. The term flammagenitus (Latin for 'fire-made') applies to cumulus congestus or cumulonimbus that are formed by large-scale fires or volcanic eruptions. Smaller low-level "pyrocumulus" or "fumulus" clouds formed by contained industrial activity are now classified as cumulus homogenitus (Latin for 'man-made'). Contrails formed from the exhaust of aircraft flying in the upper level of the troposphere can persist and spread into formations resembling any of the high cloud genus-types and are now officially designated as cirrus, cirrostratus, or cirrocumulus homogenitus. If a homogenitus cloud of one genus changes to another genus type, it is then termed a homomutatus cloud. Stratus cataractagenitus (Latin for 'cataract-made') are generated by the spray from waterfalls. Silvagenitus (Latin for 'forest-made') is a stratus cloud that forms as water vapor is added to the air above a forest canopy.[83]

Stratocumulus clouds can be organized into "fields" that take on certain specially classified shapes and characteristics. In general, these fields are more discernible from high altitudes than from ground level. They can often be found in forms such as open cells, which resemble a honeycomb with clouds around the edges and clear open space in the middle, and closed cells, which are cloudy in the center and clear on the edges.

These patterns are formed from a phenomenon known as a Kármán vortex street, named after the engineer and fluid dynamicist Theodore von Kármán.[101] Wind-driven clouds can form into parallel rows that follow the wind direction. When the wind and clouds encounter high-elevation land features such as vertically prominent islands, they can form eddies around the high land masses that give the clouds a twisted appearance.[102]

Although the local distribution of clouds can be significantly influenced by topography, the global prevalence of cloud cover in the troposphere tends to vary more by latitude. It is most prevalent in and along low-pressure zones of surface tropospheric convergence which encircle the Earth close to the equator and near the 50th parallels of latitude in the northern and southern hemispheres.[105] The adiabatic cooling processes that lead to the creation of clouds by way of lifting agents are all associated with convergence, a process that involves the horizontal inflow and accumulation of air at a given location, as well as the rate at which this happens.[106] Near the equator, increased cloudiness is due to the presence of the low-pressure Intertropical Convergence Zone (ITCZ) where very warm and unstable air promotes mostly cumuliform and cumulonimbiform clouds.[107] Clouds of virtually any type can form along the mid-latitude convergence zones depending on the stability and moisture content of the air. These extratropical convergence zones are occupied by the polar fronts where air masses of polar origin meet and clash with those of tropical or subtropical origin.[108] This leads to the formation of weather-making extratropical cyclones composed of cloud systems that may be stable or unstable to varying degrees according to the stability characteristics of the various air masses that are in conflict.[109]

Divergence is the opposite of convergence. In the Earth's troposphere, it involves the horizontal outflow of air from the upper part of a rising column of air, or from the lower part of a subsiding column often associated with an area or ridge of high pressure.[106] Cloudiness tends to be least prevalent near the poles and in the subtropics close to the 30th parallels, north and south. The latter are sometimes referred to as the horse latitudes. The presence of a large-scale high-pressure subtropical ridge on each side of the equator reduces cloudiness at these low latitudes.[110] Similar patterns also occur at higher latitudes in both hemispheres.[111]
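In standard meteorological notation (a textbook definition included here for reference, not taken from this article's sources), the horizontal divergence of a wind field with eastward and northward components u and v is

    \delta = \nabla_H \cdot \vec{V} = \frac{\partial u}{\partial x} + \frac{\partial v}{\partial y}

where \delta < 0 indicates convergence and \delta > 0 divergence. By mass continuity, sustained low-level convergence must be balanced by upward motion, which supplies the adiabatic cooling that forms clouds, while subsidence beneath upper-level convergence in a high-pressure ridge suppresses them.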

The luminance or brightness of a cloud is determined by how light is reflected, scattered, and transmitted by the cloud's particles. Its brightness may also be affected by the presence of haze or photometeors such as halos and rainbows.[112] In the troposphere, dense, deep clouds exhibit a high reflectance (70% to 95%) throughout the visible spectrum. Tiny particles of water are densely packed and sunlight cannot penetrate far into the cloud before it is reflected out, giving a cloud its characteristic white color, especially when viewed from the top.[113] Cloud droplets tend to scatter light efficiently, so that the intensity of the solar radiation decreases with depth into the cloud. As a result, the cloud base can vary from very light to very dark grey depending on the cloud's thickness and how much light is being reflected or transmitted back to the observer. High thin tropospheric clouds reflect less light because of the comparatively low concentration of constituent ice crystals or supercooled water droplets, which results in a slightly off-white appearance. However, a thick dense ice-crystal cloud appears brilliant white with pronounced grey shading because of its greater reflectivity.[112]
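As a rough illustration (a single-beam simplification; real clouds involve strong multiple scattering), the direct solar beam attenuates exponentially with optical depth \tau along its path:

    I = I_0 \, e^{-\tau}

Thick water clouds commonly reach optical depths of several tens, so essentially no direct sunlight survives to the base, whereas thin cirrus layers with \tau on the order of 1 or less remain translucent and only slightly off-white.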

As a tropospheric cloud matures, the dense water droplets may combine to produce larger droplets. If the droplets become too large and heavy to be kept aloft by the air circulation, they will fall from the cloud as rain. By this process of accumulation, the space between droplets becomes increasingly large, permitting light to penetrate farther into the cloud. If the cloud is sufficiently large and the droplets within are spaced far enough apart, a percentage of the light that enters the cloud is not reflected back out but is absorbed, giving the cloud a darker look. A simple example of this is that one can see farther in heavy rain than in heavy fog. This process of reflection and absorption is what causes the range of cloud color from white to black.[114]

Striking cloud colorations can be seen at any altitude, with the color of a cloud usually being the same as the incident light.[115] During daytime when the sun is relatively high in the sky, tropospheric clouds generally appear bright white on top with varying shades of grey underneath. Thin clouds may look white or appear to have acquired the color of their environment or background. Red, orange, and pink clouds occur almost entirely at sunrise/sunset and are the result of the scattering of sunlight by the atmosphere. When the sun is just below the horizon, low-level clouds are gray, middle clouds appear rose-colored, and high clouds are white or off-white. Clouds at night are black or dark grey in a moonless sky, or whitish when illuminated by the moon. They may also reflect the colors of large fires, city lights, or auroras that might be present.[115]

A cumulonimbus cloud that appears to have a greenish or bluish tint is a sign that it contains extremely high amounts of water; the hail or rain scatters light in a way that gives the cloud a blue color. The green coloration occurs mostly late in the day, when the sun is comparatively low in the sky and its reddish incident light appears green when illuminating a very tall bluish cloud. Supercell storms are the most likely to show this coloration, but any storm can appear this way. Such coloration does not directly indicate a severe thunderstorm; it only confirms the potential, since a green or blue tint signifies copious amounts of water, which in turn implies a strong updraft to support it, high winds as the storm rains out, and wet hail, all elements that improve the chance of the storm becoming severe. In addition, the stronger the updraft, the more likely the storm is to undergo tornadogenesis and to produce large hail and high winds.[116]

Yellowish clouds may be seen in the troposphere in the late spring through early fall months during forest fire season, the yellow color being due to pollutants in the smoke. Yellowish clouds caused by nitrogen dioxide are also sometimes seen in urban areas with high air pollution levels.[117]

Image captions from the accompanying gallery:

Stratocumulus stratiformis and small castellanus made orange by the rising sun

An occurrence of cloud iridescence with altocumulus volutus and cirrocumulus stratiformis

Sunset reflecting shades of pink onto grey stratocumulus stratiformis translucidus (becoming perlucidus in the background)

Stratocumulus stratiformis perlucidus before sunset, Bangalore, India

Late-summer rainstorm in Denmark; the nearly black base indicates that the main cloud in the foreground is probably cumulonimbus

Particles in the atmosphere and the sun's angle enhance the colors of stratocumulus cumulogenitus at evening twilight

Clouds exert numerous influences on Earth's troposphere and climate. First and foremost, they are the source of precipitation, thereby greatly influencing its distribution and amount. Because of their differential buoyancy relative to surrounding cloud-free air, clouds can be associated with vertical motions of the air that may be convective, frontal, or cyclonic. The motion is upward if the clouds are less dense than the surrounding air, because condensation of water vapor releases heat, warming the air and thereby decreasing its density. Downward motion can follow, because lifting the air cools it and increases its density again. All of these effects are subtly dependent on the vertical temperature and moisture structure of the atmosphere and result in a major redistribution of heat that affects the Earth's climate.[118]

The complexity and diversity of clouds is a major reason for the difficulty of quantifying the effects of clouds on climate and climate change. On the one hand, white cloud tops promote cooling of Earth's surface by reflecting shortwave radiation (visible and near infrared) from the sun, diminishing the amount of solar radiation that is absorbed at the surface and enhancing the Earth's albedo. Most of the sunlight that does reach the ground is absorbed, warming the surface, which emits radiation upward at longer, infrared, wavelengths. At these wavelengths, however, water in the clouds acts as an efficient absorber. The cloud water re-emits this energy, also in the infrared, both upward and downward, and the downward longwave radiation results in increased warming at the surface. This is analogous to the greenhouse effect of greenhouse gases and water vapor.[118]
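A standard way to quantify this duality (a common diagnostic in radiation studies, not specific to this article's sources) is the cloud radiative effect, the difference between the clear-sky and all-sky outgoing fluxes at the top of the atmosphere:

    \mathrm{CRE} = F_{\text{clear}} - F_{\text{all-sky}}

computed separately for shortwave and longwave radiation. With this sign convention, a positive CRE means clouds warm the planet and a negative CRE means they cool it; the shortwave (albedo) term is negative, the longwave (greenhouse) term is positive, and measurements such as those described in the next paragraph indicate that the global net is a modest cooling.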

High-level genus-types particularly show this duality with both short-wave albedo cooling and long-wave greenhouse warming effects. On the whole, ice-crystal clouds in the upper troposphere (cirrus) tend to favor net warming.[119][120] However, the cooling effect is dominant with mid-level and low clouds, especially when they form in extensive sheets.[119] Measurements by NASA indicate that on the whole, the effects of low and mid-level clouds that tend to promote cooling outweigh the warming effects of high layers and the variable outcomes associated with vertically developed clouds.[119]

As difficult as it is to evaluate the influences of current clouds on current climate, it is even more problematic to predict changes in cloud patterns and properties in a future, warmer climate, and the resultant cloud influences on future climate. In a warmer climate more water would enter the atmosphere by evaporation at the surface; as clouds are formed from water vapor, cloudiness would be expected to increase. But in a warmer climate, higher temperatures would tend to evaporate clouds. Both of these statements are considered accurate, and both phenomena, known as cloud feedbacks, are found in climate model calculations. Broadly speaking, if clouds, especially low clouds, increase in a warmer climate, the resultant cooling effect leads to a negative feedback in climate response to increased greenhouse gases. But if low clouds decrease, or if high clouds increase, the feedback is positive. Differing amounts of these feedbacks are the principal reason for differences in climate sensitivities of current global climate models. As a consequence, much research has focused on the response of low and vertical clouds to a changing climate. Leading global models produce quite different results, however, with some showing increasing low clouds and others showing decreases.[121][122] For these reasons the role of tropospheric clouds in regulating weather and climate remains a leading source of uncertainty in global warming projections.[123][124]
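In the usual feedback framework of climate modelling (standard textbook notation, not drawn from this article's sources), the equilibrium warming from a radiative forcing \Delta F is written

    \Delta T = \frac{-\Delta F}{\lambda}, \qquad \lambda = \lambda_{\text{Planck}} + \lambda_{\text{water vapor}} + \lambda_{\text{lapse rate}} + \lambda_{\text{albedo}} + \lambda_{\text{cloud}}

where \lambda, the net feedback parameter, is negative for a stable climate. A positive \lambda_{\text{cloud}} (for example, from decreasing low cloud) makes \lambda less negative and so increases \Delta T, while a negative \lambda_{\text{cloud}} damps the warming; the intermodel spread in \lambda_{\text{cloud}} is the main source of the differences in climate sensitivity noted above.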

Polar stratospheric clouds show little variation in structure and are limited to a single very high range of altitude of about 15,000–25,000 m (49,200–82,000 ft), so they are not classified into altitude levels, genus types, species, or varieties in the manner of tropospheric clouds.[6]

Polar stratospheric clouds form in the lowest part of the stratosphere during the winter, at the altitude and during the season that produces the coldest temperatures and therefore the best chances of triggering condensation caused by adiabatic cooling. They are typically very thin with an undulating cirriform appearance.[125] Moisture is scarce in the stratosphere, so nacreous and non-nacreous cloud at this altitude range is restricted to polar regions in the winter where the air is coldest.[6]

Polar mesospheric clouds form at a single extreme altitude range of about 80 to 85 km (50 to 53 mi) and are consequently not classified into more than one level. They are given the Latin name noctilucent because of their illumination well after sunset and before sunrise. They typically have a bluish or silvery white coloration that can resemble brightly illuminated cirrus. Noctilucent clouds may occasionally take on more of a red or orange hue.[126] They are not common or widespread enough to have a significant effect on climate.[127] However, an increasing frequency of occurrence of noctilucent clouds since the 19th century may be the result of climate change.[128]

Noctilucent clouds are the highest in the atmosphere and form near the top of the mesosphere at about ten times the altitude of tropospheric high clouds.[129] From ground level, they can occasionally be seen illuminated by the sun during deep twilight. Ongoing research indicates that convective lift in the mesosphere is strong enough during the polar summer to cause adiabatic cooling of small amounts of water vapor to the point of saturation. This tends to produce the coldest temperatures in the entire atmosphere, just below the mesopause. These conditions result in the best environment for the formation of polar mesospheric clouds.[127] There is also evidence that smoke particles from burnt-up meteors provide much of the condensation nuclei required for the formation of noctilucent cloud.[130]

Distribution in the mesosphere is similar to the stratosphere except at much higher altitudes. Because of the need for maximum cooling of the water vapor to produce noctilucent clouds, their distribution tends to be restricted to polar regions of Earth. A major seasonal difference is that convective lift from below the mesosphere pushes very scarce water vapor to higher colder altitudes required for cloud formation during the respective summer seasons in the northern and southern hemispheres. Sightings are rare more than 45 degrees south of the north pole or north of the south pole.[126]

Cloud cover has been seen on most other planets in the solar system. Venus's thick clouds are composed of droplets of sulfuric acid, formed from volcanically emitted sulfur dioxide, and appear to be almost entirely stratiform.[131] They are arranged in three main layers at altitudes of 45 to 65 km that obscure the planet's surface and can produce virga. No embedded cumuliform types have been identified, but broken stratocumuliform wave formations are sometimes seen in the top layer that reveal more continuous layer clouds underneath.[132] On Mars, noctilucent, cirrus, cirrocumulus, and stratocumulus composed of water-ice have been detected, mostly near the poles.[133][134] Water-ice fogs have also been detected on Mars.[135]

Both Jupiter and Saturn have an outer cirriform cloud deck composed of ammonia,[136][137] an intermediate stratiform haze-cloud layer made of ammonium hydrosulfide, and an inner deck of cumulus water clouds.[138][139] Embedded cumulonimbus are known to exist near the Great Red Spot on Jupiter.[140][141] The same category-types can be found covering Uranus and Neptune, but are all composed of methane.[142][143][144][145][146][147] Saturn's moon Titan has cirrus clouds believed to be composed largely of methane.[148][149] The Cassini–Huygens Saturn mission uncovered evidence of polar stratospheric clouds[150] and a methane cycle on Titan, including lakes near the poles and fluvial channels on the surface of the moon.[151]

Some planets outside the solar system are known to have atmospheric clouds. In October 2013, the detection of high altitude optically thick clouds in the atmosphere of exoplanet Kepler-7b was announced,[152][153] and, in December 2013, in the atmospheres of GJ 436 b and GJ 1214 b.[154][155][156][157]

Clouds play an important role in various cultures and religious traditions. The ancient Akkadians believed that the clouds were the breasts of the sky goddess Antu[159] and that rain was milk from her breasts.[159] In Exodus 13:21-22, Yahweh is described as guiding the Israelites through the desert in the form of a "pillar of cloud" by day and a "pillar of fire" by night.[158] In the ancient Greek comedy The Clouds, written by Aristophanes and first performed at the City Dionysia in 423 BC, the philosopher Socrates declares that the Clouds are the only true deities[160] and tells the main character Strepsiades not to worship any deities other than the Clouds, but to pay homage to them alone.[160] In the play, the Clouds change shape to reveal the true nature of whoever is looking at them,[161][160][162] turning into centaurs at the sight of a long-haired politician, wolves at the sight of the embezzler Simon, deer at the sight of the coward Cleonymus, and mortal women at the sight of the effeminate informer Cleisthenes.[161][162][160] They are hailed as the source of inspiration to comic poets and philosophers;[160] they are masters of rhetoric, regarding eloquence and sophistry alike as their "friends".[160] In China, clouds are symbols of luck and happiness.[163] Overlapping clouds are thought to imply eternal happiness[163] and clouds of different colors are said to indicate "multiplied blessings".[163]
