
What Are Bitcoin And Gold Saying About Paper Money – Seeking Alpha

This week, gold rose to a new short-term high and Bitcoin to a new all-time price peak. Gold is traditionally a safe-haven asset. In times of uncertainty and fear, gold tends to outperform other assets. Gold is also a traditional hedge against inflation that eats away at the value of many assets. Gold metal has a long history as a commodity and a means of exchange, a currency.

Over the course of history, gold has been around far longer than any of the currencies now traded in the foreign exchange market around the world. Gold is a commodity, and in the United States the Commodity Futures Trading Commission (CFTC) regulates the largest and most respected gold futures market in the world, the COMEX division of CME Group.

Meanwhile, Bitcoin is a cryptocurrency that is new on the financial scene. The Commodity Futures Trading Commission has defined Bitcoin as a commodity, but it is much more than that. Bitcoin is a pan-global currency. Central banks, monetary authorities, and supranational financial institutions around the world do not control Bitcoin in any way, shape, or form.

So far in 2017, the prices of both gold and Bitcoin are moving higher. The bullish price action in these two assets could be telling us a lot about the value of paper money these days as well as the future for the status quo of foreign exchange markets.

Gold takes off in December, again

After a sharp and painful correction that took gold from over $1345 on November 9 to lows of $1127.20 on the active month COMEX April futures contract on December 15, the yellow metal shifted back into bullish mode.

Source: CQG

As the daily chart highlights, gold took off to the upside again after making lows in the middle of December and traded to a high of $1264.90 on Monday, February 27. The next level of technical resistance for the yellow metal is above $1300 per ounce. Gold has moved higher as fear and uncertainty in markets have caused a flight to quality assets, and gold has a long history as a safe haven for investors.

The all-time nominal high for the price of gold came back in September 2011 when it traded to $1920.70. Gold has been making a statement about faith in paper currencies since it initially rallied from the $1046.20 level in December 2015, and the trajectory of its price is, in many ways, a commentary on faith in currencies and other asset prices these days. While gold has been shining, another alternative currency has blown the roof off and traded to a new all-time high this week.

Bitcoin moves to a new all-time high

The price action in Bitcoin has been more bullish than in gold.

Source: CoinDesk

On March 1, the cryptocurrency traded to its highest level in history when the price hit over $1225 against the U.S. dollar, and by the time you read this piece, it is possible that Bitcoin is even higher.

The price action in both gold and Bitcoin has been bullish in 2017, which I interpret as an important signal about the future of world foreign exchange markets.

Paper currencies are losing value - backed by nothing but goodwill

Paper currencies around the world have the backing of the full faith and credit of the governments that print the dollars, euros, yen, Swiss francs, pound sterling, RMB, and many other world foreign exchange instruments. Gold and Bitcoin have appreciated against all of these currencies so far in 2017.

There are virtually no countries in the world today that back their monetary units with gold, silver or any other hard asset. While central banks, monetary authorities, and supranational financial institutions continue to hold gold as part of their foreign exchange reserves, the days of a gold standard ended decades ago.

The global financial crisis of 2008 and the slowdown in Chinese economic growth over recent years have caused a tremendous amount of volatility in markets across all asset classes. Central banks have used monetary tools such as low short-term interest rates and quantitative easing to stimulate economic conditions. While many of these tools have avoided financial disaster by encouraging spending and borrowing and inhibiting savings, the trend in monetary policy and the effects of massive liquidity have diluted the value of currencies to a point where faith in central banks and governments has been on the decline.

The value of a currency is a reflection of both economic and political conditions within the nation that prints legal tender. In China, a devaluation of the RMB has led many within the nation who have seen their wealth grow over recent years to seek more stable vehicles to preserve the value of their savings.

In Europe, Japan, and many other nations around the world, economic conditions remain lethargic. Only in the United States has the economy seen a turn of events with unemployment declining and GDP starting to show signs of growth. However, the new administration in the U.S. does not wish to see a runaway dollar when it comes to value against other currencies.

The administration wants the dollar lower

In 2014, the U.S. central bank began tapering off its quantitative easing program, and in late 2015, the short-term Fed Funds rate rose above zero for the first time since the financial crisis of 2008.

Source: CQG

As the monthly chart of the U.S. dollar index illustrates, the greenback took off against other world currencies in 2014 and rose from 79.83 to 100.38 in only ten short months. That appreciation of more than 25% caused hardship for multinational U.S. companies, which found their products less competitive on world markets as a result of the rally in the dollar.

The dollar index stabilized and traded in a range from 92-100.60 during a twenty-month consolidation period that followed the ten-month rally. However, after the election of Donald Trump as the forty-fifth President of the United States, the U.S. currency broke above technical resistance on the upside and rallied to the highest level since 2002 when it traded at 103.815 at the beginning of January 2017.

In the past, administrations in the United States followed a strong dollar policy. However, there are signs that the Trump Administration under Treasury Secretary Steven Mnuchin will not advocate for a strong dollar at this time. Additionally, when the U.S. Federal Reserve released the minutes of the latest FOMC meeting last week, one of the biggest concerns voiced by members of the body that determines short-term interest rates was that a strong dollar could weigh on economic growth.

The bottom line is that the dollar is strong against virtually all other currency instruments but the administration and central bank do not want to see the dollar continue to rise to new heights versus the world's other major currency instruments. Therefore, it is probable that the rallies in gold and Bitcoin are a reflection of a world where all paper currencies are losing value.

So many issues on the horizon favor both Bitcoin and gold

Currencies are a reflection of politics and economics. It was the financial crisis of 2008, an economic event, that caused central banks to add liquidity to markets to avoid recessions, or worse, around the globe. However, today it seems that political forces have taken over and weigh on the value of monies printed by the governments of the world.

In China, the devaluation of the yuan and the non-convertibility of the currency for many Chinese has led to an increase in the demand for pan-global monetary instruments like gold and Bitcoin. The rest of Asia depends on China, the world's second-largest economy, for economic growth and stability. Therefore, the Chinese economic slowdown and currency devaluation could be leading other Asian citizens to safe-haven and pan-global monetary instruments. In Japan, short-term interest rates at negative ten basis points make the yen a less than attractive currency to hold.

In Europe, the Brexit vote last June was likely the first shoe to drop on the political front. With the United Kingdom leaving the European Union, the economic might of Europe suffered a blow. In 2017, three other major E.U. member nations will go to the polls to elect leaders for the coming years. In the U.K., many voted to exit the E.U. because of immigration policies made in Brussels. These policies are not popular with many in other member nations, and it is possible that those nations will also go rogue and elect candidates who are not supportive of E.U. policy.

The first election will take place in March, when citizens of the Netherlands go to the polls. A populist candidate is currently close to the top of the polls. In April, France will begin electing its next leader, and Marine Le Pen, a far-right, anti-immigration, and anti-E.U. candidate, is also receiving a lot of support in the polls leading up to the election. Later this year, Germans will go to the voting booth to either give Chancellor Angela Merkel another term or replace her with another candidate.

Germany is the largest and most influential economy in Europe. The Brexit vote in the United Kingdom started a nationalistic trend in Europe and if these three nations decide to reject the status quo in the months ahead, it will have dire ramifications for the future of the European Union and the euro currency.

Source: CQG

As the monthly chart of the euro currency highlights, the currency has declined from around the $1.40 level against the dollar in May 2014 to under $1.06, the lowest level since 2003. A rejection of the current leadership, and of those who favor the Union over a nationalistic solution, would likely cause the euro to weaken further in the months ahead. As in Japan, the short-term yield on the euro is at negative forty basis points, and the European Central Bank continues to follow a course of quantitative easing, making the euro a less than attractive instrument to hold.

The election of Donald Trump as the forty-fifth President of the U.S. was yet another blow to the trend towards globalism in the world. President Trump has pledged to "put America first" when it comes to relations with the rest of the world. The new administration ran on a platform that was against many multilateral trade agreements negotiated by former administrations. President Trump has told the world that trade agreements will be on a bilateral basis going forward.

He also told the rest of the world that protection comes at a price and that allied nations will need to start contributing their fair share, as America has been shouldering the financial burden of keeping the world safe while the national debt has grown to over $20 trillion. A dramatic change in U.S. relations with the rest of the world is yet another reason for uncertainty and fear when it comes to the future of financial markets.

Gold and Bitcoin are moving higher so far in 2017, and the value of paper currencies is in question as citizens across many nations go to the polls and express dissatisfaction with the status quo. It is interesting that a move away from globalism towards nationalistic candidates in the political world is causing gold and Bitcoin to appreciate. After all, in many ways, gold and Bitcoin are pan-global currency instruments that attract safe-haven buying.

For centuries, gold has been not only a store of value but also an instrument used when the political climate creates the need for flight capital. When it comes to Bitcoin, the cryptocurrency is a means for people all over the world to avoid the manipulation and restrictions placed on money by central banks and governments, so they can move wealth and savings around the globe. Many around the world are rejecting the politics of globalism in exchange for nationalism.

The world has become a smaller place because of advances in technology. The strength in gold and Bitcoin is telling us that many embrace a global view of economics and believe that their money and wealth should not be under the control of the governments and central banks in power. It will be interesting to see if global wealth and free-flowing money that travels under the radar of governments can coexist with nationalistic political policies around the world, and if governments can do anything about the increasing popularity of these assets.

The Chinese have a saying that goes something like this, "May you live in interesting times." It will be interesting to see if the trend that started in 2016 with the Brexit vote and Presidential election in the United States continues in Europe and around the globe in 2017. Right now, both gold and Bitcoin are saying that the trend is firmly in place.

I have introduced a new weekly service through Seeking Alpha Marketplace. Each Wednesday, I will provide subscribers with a detailed report on the major commodity sectors covering over 30 individual commodity markets, most of which trade on U.S. futures markets. The report will give an up, down or neutral call on these markets for the coming week and will outline the technical and fundamental state of each market.

At times, I will make recommendations for risk positions in the ETF and ETN markets as well as in commodity equities and related options. You can sign up for The Hecht Commodity Report on the Seeking Alpha Marketplace page. Additionally, check out my website for more information about commodities.

Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.

I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.


Opinion: Should you invest in a bitcoin ETF? – MarketWatch

Digital cryptocurrency bitcoin hit a record above $1,200 last week. That's in large part because of speculation about the potential launch of the first-ever U.S. bitcoin ETF, but it also may be because of the uncertainty around all manner of investments in 2017.

After all, bitcoin's advocates claim that it is a safe-haven asset akin to gold. And according to a recent CNBC analysis, the digital currency has performed better than any other currency in every year since 2010, apart from 2014.

In an age where central banks in Europe and Japan continue to keep rates in negative territory and accusations of currency manipulation are a fixture of the Trump administration, you can understand why a digital currency like bitcoin has some appeal.

But it's worth noting that bitcoin US:BTCUSD is notoriously volatile, and that its underperformance in 2014 was highlighted by a tremendous drop of about 70%, from roughly $950 in January to under $300 at the end of that year. And since the currency has digital roots and was launched less than a decade ago, it's also a popular target for internet crooks, from small-scale phishing scams targeting would-be investors to hackers making off with a cool $65 million in bitcoin from Hong Kong exchange Bitfinex.

So what's the future of bitcoin? Will an ETF launch legitimize the digital currency and create a new option for investors looking to diversify into alternative assets? Or will the ups and downs of bitcoin continue, with a lucky few winning on their gamble even as volatility and outright criminal activity bankrupt others?

Here are some pros and cons of investing in this digital currency:

Long-term staying power: Bitcoin has reached roughly $19 billion in market value, about 60% more than the total value of the digital currency during its previous peak in the 2014 bitcoin bubble. That rise hasn't been without serious volatility, of course, but the long-term gains of the last few years are dramatic, as the currency has soared from roughly $15 at the start of 2013 to roughly $1,200 at present.

Playing nice with regulators: Despite a bitcoin user base that is sometimes generalized as libertarian or even anarchistic in their politics, there are many digital currency advocates who are quite comfortable playing by the rules of Washington and Wall Street going forward. A representative of the Bitcoin Foundation, for instance, told policy makers in 2013 that the organization wishes to craft a sane regulatory environment, and that it is comfortable with oversight so long as rulemaking is open and transparent. And outside the U.S., digital currency advocates in Australia and India are pushing self-regulation as a first step toward a shared set of rules in these marketplaces. This is all very good for the future of bitcoin as a legitimate alternative asset.


Bitcoin isn't the problem, bureaucracy is: There is a very real risk that the SEC will continue to drag its feet and we may not see a bitcoin ETF in the near future. But that could be a commentary on market bureaucracy rather than the future of bitcoin. As the former head of ETF listings at the New York Stock Exchange recently told MarketWatch, "Bitcoin is new and different, and there's no incentive for regulators to be innovative." Even if there are setbacks, the rapid adoption of bitcoin is encouraging, and like many technologies, it simply needs to wait for everyone else to catch up. Being an early adopter has been highly lucrative for bold investors in recent years, and things may only improve as the market and merchants catch up.

Crazy volatility: Bitcoin is hardly the only volatile investment out there. Take three-times leveraged gold miner investments Direxion Daily Gold Miners Bull 3X ETF NUGT, +1.79% and Direxion Daily Gold Miners Bear 3X ETF DUST, -2.19% as the poster children of aggressive, short-term instruments that can make a lucky few rich or bankrupt the unprepared.

In a typical retirement portfolio, there is no real place for bitcoin or leveraged ETFs or naked short selling or other risky strategies.


Hackers and scandals: The 2015 Silk Road scandal and the 2016 Bitfinex theft are pretty dramatic examples of the risks that come with an asset that isn't tangible. And even if you treat these as isolated incidents, you have to acknowledge what such events do to investor sentiment. Just as the Wells Fargo WFC, +3.20% fraud scandal of 2016 had real consequences for the stock, further data breaches or bitcoin thefts could create huge headwinds for bitcoin investors and the adoption of the cryptocurrency. Throw in continued chatter about how bitcoin is the preferred currency for drug lords and sex traffickers, and even the most enthusiastic supporter must admit the risk of real tarnish to the bitcoin brand if these headlines continue.

Bitcoin's big risk is its big reach: A 2016 report from a group of regulators that includes the Treasury, the SEC and the Federal Reserve warned that risks of bitcoin "may not become apparent until they are deployed at scale" and specifically highlighted potential problems arising simply because of the speed and volume of transactions.

There are plenty of other honest reasons to be concerned, but when bitcoin's biggest systemic challenge is simply executing transactions and reliably integrating into the financial system, that is a big red flag. It doesn't mean bitcoin doesn't have staying power, but it should warn investors of just how risky this currency remains despite talk of a mainstream ETF.


Top 5 Bitcoin Conspiracy Theories – The Merkle

A lot has been said and written about bitcoin, even though there are some things which remain a mystery to everyone. No one knows for sure who created bitcoin, or why it was designed the way it is. Several conspiracy theories regarding bitcoin have popped up on the internet over the past few years, some of which seem more credible than others.

Back in Q4 of 2013, an intriguing conspiracy theory showed up on the bitcointalk forums. Someone floated the idea that major governments, specifically the US and China, were buying up as much bitcoin as possible for themselves. Considering how both governments have opposed bitcoin since its inception, this theory is not as impossible as one might think. After all, government officials with insider knowledge could tell people to look the other way while they scoop up cheap coins. Although this theory has never been proven, it is somewhat plausible.

Perhaps one of the more popular conspiracy theories is that bitcoin was created by the NSA as an experiment. Satoshi Nakamoto, whose real identity remains unknown to this day, is believed to be a pseudonym for the NSA's top-secret cryptocurrency project. It is also believed the NSA has a backdoor to the SHA-256 algorithm, making bitcoin far less secure than most people assume. Another theory that has never been proven, yet could be worth entertaining.
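Backdoor speculation aside, SHA-256's role in bitcoin is concrete: block headers are hashed twice over, so-called double SHA-256, and a hash below the network's target proves work was done. A minimal sketch using only Python's standard library:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256: hash the hash."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# The digest is always 32 bytes, whatever the input size.
# A hidden weakness in SHA-256 would undermine both mining and
# block integrity, which is why the backdoor theory is so alarming.
digest = double_sha256(b"block header bytes")
```

This is why the backdoor claim matters so much: every block's identity and every miner's proof-of-work rest on this one function.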

Considering China has always been keen to embrace bitcoin, many people believe it to be an invention by Chinese developers. While it is unclear why this would matter, theorists believe China may have created bitcoin to ultimately replace its reliance on the US Dollar. While it is true China has been working with Russia and others on a payment system that can rival SWIFT, it is doubtful it would create bitcoin to replace the US Dollar.

There could be a very good reason why no one has unmasked Satoshi Nakamoto yet. Despite the numerous manhunts over the years, it is quite possible Satoshi Nakamoto is not a person at all. Some people speculate the name is a pseudonym for an artificial intelligence created by an undisclosed government. This is by far the most far-out of these theories, although it sparks some interesting discussions along the way.

Although this theory was merely offered to the world as a jest, there are some intriguing factors that seem to fall into place. Mt. Gox pushed the bitcoin value to new highs, at which point it threatened the status of gold as the prominent safe-haven asset. JPMorgan, so the theory goes, orchestrated a bitcoin price crash by selling over 8,000 coins to drive the price down.

At the same time, JPMorgan exploited withdrawal vulnerabilities on the Mt. Gox exchange, while funding anti-bitcoin articles greedily lapped up by mainstream media outlets. All of these concepts sound somewhat reasonable, depending on how large the tinfoil hat you wear is. Then again, it is true institutions such as JPMorgan see little merit in bitcoin, yet favor its underlying technology. It is very doubtful the institution would deliberately crash the bitcoin price, though.

If you liked this article, follow us on Twitter @themerklenews and make sure to subscribe to our newsletter to receive the latest bitcoin, cryptocurrency, and technology news.


Kraken Acquires Yet Another Cryptocurrency Firm, Cryptowatch – Finance Magnates

Kraken, one of the largest cryptocurrency trading venues in the world, has acquired the charting and trading platform Cryptowatch. Used by thousands to trade up to 22 digital assets, Cryptowatch has seen rapid growth in the past two years, increasing its active user base by 700%.

The financial details of the acquisition have not been disclosed, but it is revealed that Cryptowatch founder Artur Sapek is joining Kraken as part of the deal to lead the development of its interface, while continuing to develop Cryptowatch. Kraken has also already leveraged Cryptowatch to release an upgraded trading interface based on the platform.

"In just 2 years Cryptowatch grew into one of the pillars of the digital asset trading community," Sapek said. "Teaming up with an exchange was the natural next step, and Kraken was my first choice. The Kraken team has built a very mature and reliable exchange, and I look forward to working with them to deliver the best trading software in the industry."

"I'm thrilled to welcome the Cryptowatch trading platform and its founder into the Kraken family," said Kraken CEO Jesse Powell. "As the industry's leading charting tool for traders, we plan to devote more resources and talent to further enhance its offering. And we've purposed the technology to provide a great new charting and trading platform to Kraken clients as the first step in improving our own interface. It's a great start to 2017 and I can't wait to share what else we've got in the pipeline."

In 2016 Kraken acquired three major bitcoin exchanges (Coinsetter, Cavirtex and CleverCoin) as well as the bitcoin wallet funding service Glidera, soon to be rebranded as Kraken Direct.


Botnets mining cryptocurrency – Enterprise Times


Security vendor Forcepoint has identified a new mining botnet targeting the Monero cryptocurrency. Using bots to mine cryptocurrencies is nothing new and there have been several claims that botnets are targeting Monero over the last 15 months. This blog by Luke Somerville and Abel Toro goes further. It provides the evidence for an active botnet exploiting SMEs and local government systems in the Haut-Rhin region of France.

Using a botnet to mine for cryptocurrencies today makes sense. The complexity of the problems to be solved requires an ever-increasing amount of compute power. This has created a number of cooperatives, where members join a mining consortium to share compute power and make money. What is happening here is that cybercriminals are looking for a more profitable route than being part of a mining consortium.
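The "problem to be solved" in proof-of-work mining boils down to a brute-force search for a nonce whose hash meets a difficulty target. A toy sketch in Python (note: Monero's actual CryptoNight algorithm is memory-hard and quite different; plain SHA-256 stands in here purely to show the shape of the search):

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Brute-force a nonce whose SHA-256 hash starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra leading zero multiplies the expected work by 16,
# which is why solo mining gave way to pooled (consortium) mining.
nonce = mine(b"example block", 4)
```

Pool members split this search space and share the reward; a botnet simply steals the compute power instead of paying for it.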

Somerville and Toro reference a Malwarebytes report from January. In that report, researchers looked at the use of the Sundown exploit kit to deploy a cryptocurrency miner. That mining tool was focused on Monero and was being actively updated. It appears from the Somerville and Toro blog that other campaigns to infect machines have been successful. Surprisingly, both blogs call out the lack of obfuscation in the code used in the attacks. This has made it easy for the researchers to identify and examine the attacks.

The command and control servers are mainly hosted on legitimate websites. Interestingly, the majority of those websites are hosted on OVH. This might be the attackers using sites based in France to get around security controls on the target machines: the theory is that security software would expect local users to access sites based in France rather than in Vietnam, Russia or China.

It is a surprise that we haven't seen more campaigns aimed at botnet mining of cryptocurrency over the last few years. The increasing price of most cryptocurrencies is enough to make it attractive. Given the size of some botnets and their cost, it is certainly cost-effective. What is interesting is that this campaign is targeted at a cryptocurrency that is relatively unknown outside the DarkNet.

Ian has been a journalist, editor and analyst for over 30 years. While technology remains the core focus of Ian's writings he also covers science fiction, children's toys, field hockey and progressive rock. As an analyst, Ian is the Cyber Security and Infrastructure Practice Leader for Creative Intellect Consulting Ltd. A keen hockey goalkeeper, Ian coaches and plays for a number of clubs including Guildford Hockey Club, Alton Hockey Club, Royal Navy, Combined Services, UK Armed Forces and several touring sides. His ambition is to one day represent England. Ian has also been selected to be the goalkeeping coach for Hockey for Heroes, a UK charity supporting the UK Armed Forces.


Nimble: Just as well our cloud storage runs in our own cloud, eh, eh? – The Register

Explainer Nimble's Cloud Volumes (NCVs) store block data for use by Amazon or Azure compute instances, but the NCVs themselves are not stored in either Amazon's Elastic Block Store or the Azure cloud.

With remarkable timing, Nimble made these claims just hours before the S3 outage, which had knock-on effects for EBS and other services. The storage contender claimed the two cloud giants' infrastructure does not have the availability or reliability needed. Nimble staffer Dimitris Krekoukias quoted Amazon EBS documentation as an example to justify this stance:

He claimed: "Every single customer I've spoken to that has been looking at AWS had never read that link I posted in the beginning, and even if they had, they glossed over the reliability part."

Krekoukias, blogging as RecoveryMonkey, claimed the following about EBS and Azure block storage:

Nimble says data centre transactional applications, which use block storage, are inhibited from moving to a cloud whose reliability it cannot guarantee, and that it has built its own cloud to deliver the Nimble Cloud Volumes service.

Customers can choose a capacity, performance and backup SLA, and attach to either AWS and/or Azure. And "The users never see or touch a Nimble system in any way. All they see is [our] easy portal."

Nimble claims six "nines" of storage uptime, and data integrity millions of times better than what native cloud block storage provides. Really?

In a separate blog, Krekoukias writes: "Nimble creates a checksum and a self-ID for each piece of data. The checksum protects against data corruption. The self-ID protects against lost/misplaced writes and misdirected reads."

He says that, as well as block-level checksums, the Nimble storage does multi-level checksums.

Add this to the triple-plus parity RAID scheme Nimble uses, and Krekoukias thinks he is justified in saying NCVs are more than a million times more durable than EBS or Azure block storage.
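The checksum-plus-self-ID idea Krekoukias describes can be illustrated in a few lines; the class below is a hypothetical sketch, not Nimble's implementation. The checksum catches bit-rot, while storing the block's own address alongside it catches writes that landed in the wrong place:

```python
import hashlib

class BlockStore:
    """Toy block store with a per-block checksum and self-ID (illustrative only)."""

    def __init__(self):
        self.disk = {}  # address -> (payload, checksum, self_id)

    def write(self, address: int, payload: bytes) -> None:
        checksum = hashlib.sha256(payload).digest()
        self.disk[address] = (payload, checksum, address)

    def read(self, address: int) -> bytes:
        payload, checksum, self_id = self.disk[address]
        if hashlib.sha256(payload).digest() != checksum:
            raise IOError("checksum mismatch: data corruption")
        if self_id != address:
            raise IOError("self-ID mismatch: lost write or misdirected read")
        return payload

store = BlockStore()
store.write(7, b"transaction log page")

# A write that lands at the wrong address carries the wrong self-ID,
# so reading address 8 raises instead of silently returning block 7's data.
store.disk[8] = store.disk[7]
```

A plain checksum alone would pass the misdirected read, because block 7's data is internally consistent; only the self-ID reveals that the wrong block came back.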

He says that, with database IOs prioritised over low-latency sequential IOs, Nimble offers an IOPS SLA with its Cloud Volumes.

Krekoukias said Nimble is "still working on pricing but the monthly commitment is $1,500. The minimum customer commitment is 1 month. The minimum volume commitment is 1 day."

"So, a customer that's already paying $1,500 could create a huge volume temporarily to test something and then delete it in a few hours. We will only charge them for that day of use."

The Register has contacted Amazon for comment about the claims.

Nimble has built a public block storage cloud service, which means quite some investment in facilities and software. Obviously it thought this was the best, if not the only, way to get on-premises transactional block data availability and durability levels up to mission-critical levels. That way it can continue to sell its storage facilities on a cloud-usage basis and integrate with its on-premises gear in a hybrid cloud model.

Of course, its users are locked in to NCVs, but with NCV availability and durability being, as far as we know, unique in the public cloud arena, that will be a trade-off its customers are willing to make.

There are alternative public clouds for backup data, like those from Backblaze and Carbonite, but block storage is quite another matter.

This Nimble public cloud block storage is certainly an individual marketing tactic, and we wonder if other on-premises storage array suppliers will do the same thing. We'd point out that, as far as we know, no other stand-alone storage supplier is doing this. There are IBM and Oracle with their public clouds, but these are system-level offerings. Dell (EMC), HDS, HPE and NetApp are not doing what Nimble has settled on.

It is, literally, a nimble offering. We're surprised, and say it's great to see a small player shake up the cloud block storage market. Let's hope it builds up a sufficient customer base for it to withstand whatever pricing hammer blows Amazon might send its way in the future.

Read more:
Nimble: Just as well our cloud storage runs in our own cloud, eh, eh? - The Register

Read More..

Edge computing will blow away the cloud – CIO

SAN FRANCISCO -- The ubiquitous cloud computing craze may not be long for this world if venture capitalist Peter Levine is right. The Andreessen Horowitz general partner said that as more computing capabilities move to so-called "edge" devices, including anything from driverless cars and drones to the boundless devices that make up the internet of things (IoT), the cloud will slowly evaporate.

"A large portion of computation that gets done in the cloud today will return to the edge," said Levine at the Wall Street Journal's CIO Network event here Tuesday.

Levine said the driverless car, whose 200-plus CPUs effectively make it a "data center on wheels," is a prime example of an edge device whose computing capabilities must be self-contained. Levine said that an autonomous vehicle relying on the cloud for data would blow through stop signs and crash because of the latency associated with transmitting data from the car to the cloud. The cloud will also cripple many scenarios for machine learning, which relies on speedy computing to deliver faster decision-making.
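The latency argument can be made concrete with some back-of-the-envelope arithmetic. A minimal sketch, using illustrative speed and round-trip-time assumptions that are not figures from Levine:

```python
# How far a vehicle travels "blind" while waiting on a cloud round trip.
# All numbers below are illustrative assumptions, not measured values.

def distance_during_rtt(speed_kmh: float, rtt_ms: float) -> float:
    """Metres travelled during one network round trip."""
    speed_ms = speed_kmh * 1000 / 3600  # convert km/h to m/s
    return speed_ms * (rtt_ms / 1000)   # convert ms to seconds

# A car at highway speed with a typical mobile-network round trip:
blind_distance = distance_during_rtt(speed_kmh=110, rtt_ms=100)
print(f"{blind_distance:.1f} m travelled before a cloud reply arrives")  # about 3 m
```

Three metres is roughly the length of an intersection crosswalk, which is why control decisions for an autonomous vehicle have to be made on the edge device itself.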

Edge computing is less a novelty than perhaps the next computing cycle, Levine said. Decades ago most computing was centralized in mainframes, with banks and most other large enterprises relying on the refrigerator-sized cabinets to manage their business operations.

Many mainframes were decommissioned to make room for the decentralized client-server era. The cloud is essentially the new mainframe hosted in a vendor's data center. If the natural ebb and flow of computing holds, the edge will accelerate the next leg of distributed computing. And that means the cloud "goes away in the not-too-distant future," Levine said.

It's a scary proposition for the thousands of vendors hawking cloud services. For the past decade, Amazon Web Services, Google, Microsoft, Salesforce.com and others have stood up applications, infrastructure, storage and virtually every conceivable type of computing task as a service. But a venture capitalist's job is to take a broader, longer view to see what innovations are coming next. Levine is essentially saying the cloud disruptors will be disrupted in the next five to 10 years.

[ How to make hybrid cloud work ]

Diana McKenzie, CIO of cloud business application provider Workday, isn't buying the "provocative point" that the cloud will disappear. She said it will co-exist with the edge. For example, McKenzie said that companies will want to aggregate data collected from edge devices in a cloud for analysis and, ideally, business insights.

"I can't imagine there will ever not be a place for cloud computing," McKenzie tells CIO.com. "The challenge for us as CIOs is to make sure we're thinking about it more on a continuum than on a black and white basis. Then the next challenge is how you architect for that."

The cloud-to-edge debate was a hot topic, but it was hardly the only theme tackled by Levine and his panel peers -- Accel partners Rich Wong and General Catalyst Managing Director Steve Herrod offered some other thoughts about emerging trends.

From big data to machine learning: Big data 1.0 included collecting lots of information but the next wave involves predicting what is going to happen in the future, said Levine. "Machine learning unlocking these vast stores of information that we have... that can help us predict the future in better ways is absolutely happening right now," Levine said. For example, machine learning is used to predict cybersecurity attacks and IT system failures.

Wong said that enterprises can use machine learning to automate IT service functions, such as password resets for customers. Entrusting such corporate operations to machine algorithms can yield anywhere from 30 percent to 100 percent cost savings, the VCs said.

Bottoms up, the polite euphemism for shadow IT: Wong said that while VCs encourage portfolio companies to deploy a "land and expand" strategy and get into businesses through departments rather than going through the CIO, it's a delicate balance. The CIO must grapple with the challenges and risks associated with adopting potentially unproven technology but benefits from the speed of on-boarding employees. Thanks to the cloud, many are onboarding themselves. Levine says that shadow IT has extended to developers. "I've seen situations where if the organization doesn't provide what the developer needs, they go to another company to get services and tools," Levine says.

Proof-of-concepts-as-a-service: It's become fashionable for CIOs to fancy themselves as "IT-as-a-service" providers, essentially brokers of digital capabilities, including cloud, mobile, analytics and IoT. In this model, it makes sense for CIOs to recognize that proof of concepts are a valuable way to evaluate new technology, Herrod said. He suggested that startups offer proof-of-concepts as a service to help.

The winner for hardest position for CIOs to hire goes to data analysts: Levine says that if data is the most important ingredient in unlocking business value, then data scientists and analysts who can derive insights from the data and turn it into actionable information will be the toughest positions to fill.

Herrod disagrees and says that he's found the hardest hires are DevOps leaders because there is little consensus on what defines DevOps, a model for rapid software development popularized by consumer internet companies. Herrod says that he's heard descriptions of DevOps managers range from scrum masters who run agile computing systems for speed and innovation to specialists who optimize cloud infrastructure.

Read more here:
Edge computing will blow away the cloud - CIO

Read More..

AWS Outage: Implications for Internet, Enterprise Cloud Customers – CIO Today

Yesterday's hours-long Amazon Web Services (AWS) outage provided a vivid illustration of how much large parts of the Internet depend on the cloud service. It also presented a puzzle for many users: because the AWS health dashboard itself depends on the cloud service, the status messages failed to indicate any signs of trouble throughout the outage.

Now resolved, the Feb. 28th outage of Amazon's S3 (Simple Storage Service) cloud-based object storage service caused many Web sites to be inaccessible or slow to load for several hours. Affected sites and services included Adobe, Coursera, Cracked, Imgur, Mailchimp, Medium, Quora, Slack, Trello as well as Internet health-tracking sites such as Downdetector and Is It Down Right Now.

S3 is an "object storage with a simple Web service interface to store and retrieve any amount of data from anywhere on the Web," according to Amazon. Used by more than 150,000 Web sites, S3 is designed for up to 99.99 percent availability. Yesterday's outage illustrated that one-in-ten-thousand chance of non-availability.
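That 99.99 percent design target translates directly into an expected-downtime figure. A quick sketch of the arithmetic (the availability percentage comes from Amazon's S3 description; everything else is ordinary unit conversion):

```python
# Convert an availability percentage into expected unavailability per year.

def downtime_per_year(availability_pct: float) -> float:
    """Expected hours of downtime per year at a given availability level."""
    hours_per_year = 365 * 24  # 8,760 hours in a non-leap year
    return hours_per_year * (1 - availability_pct / 100)

# S3's designed 99.99% availability is the "one in ten thousand"
# chance of non-availability mentioned above:
print(f"{downtime_per_year(99.99):.2f} hours/year")  # roughly 0.88
```

In other words, a service that hits its 99.99 percent target can still be down for the better part of an hour each year; the Feb. 28 outage alone exceeded that budget.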

Problem at Virginia Data Center

While Amazon's cloud service health dashboard gave no indication of trouble, yesterday morning AWS noted on its Twitter account that S3 was "experiencing high error rates" that the company was working to recover. Because the dashboard wasn't showing alert color changes due to the S3 issue, Amazon also posted updates in a banner at the top of the Web page.

By 1:49 p.m. PST, all S3 services for object retrieval, listing, deletion and addition had been recovered and were back to working normally, Amazon said. The company said that the outage was traced to its US-EAST-1 gateway location, which is its data center in northern Virginia.

During the outage, Twitter became the place for various AWS customers and others to share information as well as to vent and post humorous items about the event. Adobe Customer Care, for example, posted a GIF of a puppy stampede to take customers' minds off the service outage, while another popular meme was a screenshot of Homer Simpson's dad with the headline, "Old Man Yells at Cloud."

Enterprises Need 'Balanced Approach'

In an analysis published today in Forbes, analyst Patrick Moorhead said yesterday's outage underscored a problem not with Amazon, but with enterprise users who don't fully consider the implications of moving key services into the cloud.

"This incident is an indictment, not of AWS or Amazon.com, but of business and IT decision makers," said Moorhead, who is founder, president and principal analyst at Moor Insights & Strategy. "Too often the decision to move IT services to the public cloud was driven by either cost or the thought that 'we need to get to the cloud to be competitive.' But not understanding the value that your IT can deliver today shortchanges the business."

While it makes sense for many enterprises to move some workloads into the public cloud, other services require a more balanced approach that might include use of private cloud as well as legacy systems, according to Moorhead.

The non-profit Institute for Local Self-Reliance made a similar observation in a report about Amazon published in November. "Amazon increasingly controls the underlying infrastructure of the economy," the report noted. "Its Amazon Web Services division provides the cloud computing backbone for much of the country, powering everyone from Netflix to the CIA."

In its most recent quarterly financial report issued in early February, Amazon said its AWS operating income for the 12 months ending Dec. 31 amounted to $3.1 billion, compared to $1.5 billion for the same 12 months in 2015.

Read more:
AWS Outage: Implications for Internet, Enterprise Cloud Customers - CIO Today

Read More..

Salesforce CRM Helping Healthcare Providers Boost Service – Top Tech News

By Shirley Siluk / Top Tech News. Updated March 01, 2017.

Visit link:
Salesforce CRM Helping Healthcare Providers Boost Service - Top Tech News

Read More..

Bouncing Back To Private Clouds With OpenStack – The Next Platform

March 1, 2017 Timothy Prickett Morgan

There is an adage, not quite yet old, suggesting that compute is free but storage is not. Perhaps a more accurate and, as far as public clouds are concerned, apt adaptation of this saying might be that computing and storage are free, and so is inbound networking within a region, but moving data out of a public cloud is brutally expensive, and it is even more costly spanning regions.

So much so that, at a certain scale, it makes sense to build your own datacenter and create your own infrastructure hardware and software stack that mimics the salient characteristics of one of the big public clouds. What that tipping point in scale is really depends on the business and the sophistication of the IT organization that supports it; Intel has suggested it is somewhere around 1,200 to 1,500 nodes. But clearly, just because a public cloud has economies of scale does not mean that it passes all of those benefits on to customers. One need only look as far as the operating profits of Amazon Web Services to see this. No one is suggesting that AWS does not provide value for its services. But in its last quarter, it brought nearly $1 billion to its middle line out of just under $3.5 billion in sales and that is software-class margins for a business that is very heavily into building datacenter infrastructure.
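The "software-class margins" claim is easy to check against the figures quoted in the paragraph above (roughly $1 billion in operating income on just under $3.5 billion in quarterly sales):

```python
# Rough AWS operating margin from the quarterly figures cited above.
operating_income = 1.0e9  # nearly $1 billion brought to the "middle line"
revenue = 3.5e9           # just under $3.5 billion in quarterly sales

margin = operating_income / revenue
print(f"Operating margin: {margin:.0%}")  # around 29%
```

An operating margin near 30 percent is far above what hardware-centric infrastructure businesses typically earn, which is the article's point: the economies of scale are not all being passed through to customers.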

Some companies, say the folks that run the OpenStack project, are ricocheting back from the public cloud to build their own private cloud analogues, and for economic reasons. Luckily, it is getting easier to use tools like OpenStack to support virtual machine, bare metal, and container environments. This is, of course, a relative thing, too. No one would call OpenStack easy, but the same holds true for any complex piece of software such as the Hadoop data analytics stack or the Mesos cluster controller, just to call out two.

"People are realizing that the public cloud, and in particular the hyperscale providers like AWS, Google, and Microsoft, are really in some cases the most expensive way to do cloud computing," Mark Collier, chief operating officer at the OpenStack Foundation, tells The Next Platform. "There was this misconception early on that, because of economies of scale, the hyperscale clouds would be cheaper. But if you look at the earnings releases of AWS and others, their growth rates are slowing in the public cloud, and we think that has a lot to do with cost. So we are starting to see some major users of AWS standing up private clouds powered by OpenStack and moving certain strategic workloads off of AWS and repatriating them internally. There are a lot of reasons for this, but cost is the biggest driver."

OpenStack users have, over the past three years, moved from tire kicking to production

The public cloud is worth a premium over private clouds for a bunch of reasons, not the least of which is that customers using public clouds do not have to pay the capital costs of infrastructure or the management costs of making it run well. And having the ability to do utility-priced, instant on and off capacity is also worth a premium, and we know this because steady-state rental of capacity on clouds costs less than on-demand capacity. (As we would expect.) But, says Collier, a lot of customers have steady-state workloads that just run, and even though there are ways to bring the costs down on public clouds where the virtual machines just sit there, day in and day out, customers moving off AWS to a private cloud can see anywhere from a 50 percent to 70 percent cost reduction for these constantly running jobs.
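A minimal model of that 50 to 70 percent claim, using a hypothetical monthly bill; none of these dollar figures come from Collier or the article:

```python
# Illustrate the claimed private-cloud cost reduction for steady-state
# workloads. The monthly bill below is a hypothetical example figure.

def private_cloud_cost(public_monthly: float, reduction_pct: float) -> float:
    """Monthly cost after applying the claimed percentage reduction."""
    return public_monthly * (1 - reduction_pct / 100)

public_cost = 10_000.0  # hypothetical monthly public-cloud bill, always-on VMs
for pct in (50, 70):
    print(f"{pct}% reduction -> ${private_cloud_cost(public_cost, pct):,.0f}/month")
```

The model deliberately ignores the capital and staffing costs of running a private cloud, which is exactly the trade-off the surrounding paragraphs say each enterprise has to weigh for itself.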

Those are some big numbers, and we would love to see this further quantified and qualified.

Collier also points out that OpenStack is used to create dozens of big public clouds, so it is not just a private cloud technology. (Rackspace Hosting, one of the co-founders of OpenStack along with NASA, operates what is probably the largest OpenStack cloud for its Cloud Servers and Cloud Storage services.)

"Public and private clouds are all growing, but customers are getting more strategic about where to place their workloads so they get their money's worth," says Collier. "And if you are paying to turn resources on and off, and you are not doing that, then you are wasting your money. People are no longer wondering when they are moving to clouds; they pretty much know everything is going in a cloud environment. But now they are thinking about which type makes sense. People are starting to dig into the numbers."

It is hard to say how much of the compute, storage, and networking capacity installed worldwide (and only running enterprise applications at that) is on private clouds versus public clouds versus traditional uncloudy infrastructure. And Collier was not in a mood to take a wild guess about how, years hence, this pie chart might shake out. But he concurred with us that it might look like a 50-50 or 60-40 split between private and public cloud capacity over the long haul. A lot will depend on economics, both in terms of what the public clouds charge and what enterprises can afford in terms of building their own cloud teams and investing in infrastructure.

If Amazon Web Services offered a private cloud you could plunk into a datacenter, and at a private cloud price, this would certainly change things. But it also might make AWS a whole lot less money, which is why we think maybe the top brass at AWS are not so keen on the idea. They might be able to double, triple, or quadruple their aggregate compute and storage, but not make more money doing it unless customers decide to use AWS management on their baby, private AWS clouds, should they ever come to pass.

And having a private VMware cloud running in AWS datacenters, as will be done this year, does not count. We are not sure of much in this world, but we fully expect for capacity on this VMware Cloud on AWS service to cost considerably more than hosting a private cloud based on the ESXi hypervisor, vCenter management tools, and some of the vRealize cloud management tools.

There are a couple of things that are giving OpenStack a second wind, and it is not just the backdraft effect off of big public clouds by enterprise customers.

For one thing, OpenStack is getting more refined and more polished, as demonstrated by the Ocata release that was put out by the community two weeks ago. This release had a relatively short development cycle, coming out about two months ahead of the usual cadence, but the future Pike release will get back to the normal six-month release cadence that OpenStack has adhered to for years now.

One big change with the Ocata release of OpenStack is that the horizontal scaling mechanism for the Nova compute portion of OpenStack, called Cells, has gotten a V2 update and is not only ready for primetime, but is running with Nova by default starting with Ocata. In essence, Cells allows for multiple instances of the Nova compute controller (including its database and queue) to be distributed in a single cluster and be federated for management. Cells was developed by Rackspace, has been used in production since August 2012, and has been in formal development for OpenStack since the Grizzly release back in 2012; it can be used to federate clustered Nova controllers within a datacenter or region or across regions.

Nova also now includes a feature called the placement and resource scheduler; it does not yet have a funky name because it has not been busted free of Nova, but Jonathan Bryce, executive director of the OpenStack Foundation, says that this scheduler could eventually be broken free and used to control certain aspects of other portions of the OpenStack stack. This is a new way of managing the assets that comprise a cloud (servers, storage devices, networking equipment, and so on), adding intelligence to their placement. So, for instance, it tracks the kinds of devices and their capacities and performance, and with a set of APIs you can request that a workload be deployed on a specific collection of resources, and this scheduler can find it and make it happen through Nova.

The first and second generations of cloud, according to OpenStack

"The idea is that we are on the second generation of clouds, and they are easier to run, and that makes them more cost effective and also opens them up for deployment by more people," says Bryce, which sets up a virtuous cycle. "But the other attribute of Gen 2 clouds is that they do more things. When OpenStack was just starting, it was basic virtualization with self-service and elastic provisioning. When you look at it now, what you see are cloud-native applications, but also things like SAP and network function virtualization workloads. So the private cloud today costs less, but it also does more. So having a more intelligent scheduler that makes sure you put an NFV workload onto a server that has high performance networking gear, or you put a data analytics workload onto something that has high performance I/O, these are the things that end up making these new clouds extremely capable and able to run these new workloads."

And this is also why OpenStack use is exploding in new markets, particularly China, where there is no established virtualization player and lots of companies are greenfield installations.

With OpenStack now seven years old, it has become a reasonably mature platform thanks to the hard work of thousands of software engineers and the enlightened self-interest of their employers. And it is reasonable to ask if OpenStack, like other open source infrastructure components like the Linux kernel and the bits that wrap around it to make it an operating system, is largely done.

OpenStack has thousands of marquee enterprise customers, and this is just a sampling

"There's always something more to do," says Bryce. "OpenStack is an interesting animal in some ways because it has these very primitive core functions such as virtualization and networking, and those are necessary for every single workload, every single application that runs on any platform. Those are key, and fairly stable and mature. Where we are seeing exciting work still happen is how you leverage and integrate these infrastructure primitives to meet new workloads."

For instance, a lot is happening in the OpenStack community with software containers right now. Not only is OpenStack being containerized itself so it can be deployed and managed better, but containers are being added atop either virtualized or bare metal OpenStack clouds so they can be used to manage other applications that in turn run on OpenStack.

"When you layer dynamic application management through containers on top of programmable infrastructure, you really get the best of both worlds," Bryce explains. "But in order to achieve this, you need tight integration between the two."

Just as was the case with server virtualization based on hypervisors when it became popular on X86 platforms a decade ago, there is much weeping and gnashing of teeth with regard to both the networking and the storage underpinning container environments. So OpenStack shops are combining the Neutron virtual networking with Cinder block storage and the Kubernetes container scheduler, or gluing together Nova compute with Cinder block storage and Docker container runtimes. The Kuryr project provides the link between Docker and Neutron, hence its name, courier, and a subproject called Fuxi connects Cinder block storage and Manila shared file systems to Docker in a similar fashion.

Categories: Cloud, Compute, Enterprise

Tags: container, Docker, OpenStack

See original here:
Bouncing Back To Private Clouds With OpenStack - The Next Platform

Read More..