
How to add cloud-based bookmarks to your Nextcloud server – TechRepublic

Image: Jack Wallen

Nextcloud is one of the most flexible and easy to use cloud servers. Not only is Nextcloud an incredibly simple cloud server to get up and running, but you can expand its capabilities through a user-friendly app store.

An especially useful feature that can be included in the Nextcloud server is the cloud-based Bookmarks app. With this feature, users can add and tag bookmarks to be saved in their personal Nextcloud account. This makes it easy for users to reach their work-based bookmarks from any browser that can access the Nextcloud server.

I'll walk you through the process of adding the Bookmarks app and then illustrate how easy it is to work with. I assume your Nextcloud server is set up and running, and that you have an administrative account on the server.

SEE: Video: The 5 trends that form the future of cloud computing (TechRepublic)

To add the app, open the Nextcloud app store, locate Bookmarks, and enable it. Nextcloud will download and install the app. Once it's installed, the app is ready to go; there are no settings to configure.

Click the Apps drop-down and select Bookmarks from the list. In the new window, you will see a warning that you have no bookmarks. Nextcloud makes it easy to add bookmarks: it includes a bookmarklet that you can drag to your browser's toolbar (Figure A).

Figure A

The Nextcloud bookmarklet is ready to be added to your browser.

Once you drag the bookmarklet to your browser, you can add bookmarks to Nextcloud by pointing your browser to the site you want to bookmark and then clicking the bookmarklet. In the window that appears, you can edit the bookmark's name, tag, and description (Figure B).

Figure B

Adding a TechRepublic Cloud bookmark to Nextcloud.

After the bookmark is added, you can return to your Nextcloud server, click the Apps drop-down, click Bookmarks, and view/edit your current listing of bookmarks. If there's a URL you want to visit, click the name (Figure C) and a new tab will open to that link.

Figure C

Bookmarks are easily accessed, edited, and tagged.

You can also add new bookmarks from within the Nextcloud Bookmarks app by typing the URL into the Address field and clicking the + button. Nextcloud will call out to the URL to get the name of the site and automatically add it to the bookmark.
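If you prefer to script bookmark creation, the Bookmarks app also exposes a REST endpoint. The sketch below shows one way that could look from Python; the endpoint path, parameter names, server address and credentials are assumptions (the article itself only covers the web interface), so check the Bookmarks app documentation for the version running on your server.

```python
# Hypothetical sketch: create a bookmark through the Bookmarks app's REST API.
# The endpoint path, parameter names, server URL and credentials are assumptions;
# verify them against the Bookmarks app documentation for your Nextcloud version.
import requests

NEXTCLOUD_URL = "https://cloud.example.com"        # assumed server address
USER, APP_PASSWORD = "alice", "app-password-here"  # assumed credentials

def add_bookmark(url, title, tags=None):
    """Add a bookmark to the authenticated user's Nextcloud account."""
    endpoint = f"{NEXTCLOUD_URL}/index.php/apps/bookmarks/public/rest/v2/bookmark"
    payload = {"url": url, "title": title, "tags[]": tags or []}
    response = requests.post(endpoint, data=payload, auth=(USER, APP_PASSWORD))
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = add_bookmark("https://www.techrepublic.com/topic/cloud/",
                          "TechRepublic Cloud", tags=["work", "cloud"])
    print(result)
```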

The Bookmarks app can import and export bookmarks, which means you can export a browser's bookmarks and then import them into Nextcloud or vice versa. To get to this feature, click the Bookmarks app from the Apps drop-down, click the small gear icon in the bottom right of the window, and click either Export or Import (Figure D). The only downfall of the Import feature is that your bookmarks won't be tagged, so you'll have to do that manually.

Figure D

Importing and exporting bookmarks.

The Bookmarks app helps make your life a tiny bit easier. This feature may not change the way you work, improve your ROI, or secure your network, but it can extend your Nextcloud server offerings and help users get the most out of your company's hosted cloud. And with the help of the Nextcloud bookmarklet, adding bookmarks to a Nextcloud account is incredibly simple.


Microsoft Azure Cloud Gaining Traction in Business – Fortune

Use of Microsoft Azure and Google Cloud Platform is on the rise, according to The 2017 State of the Cloud Report compiled by cloud management company RightScale. That's good news for what are generally seen as the number two and three players in public cloud, respectively, following market leader Amazon Web Services.

A survey of just over 1,000 IT professionals on their usage of public cloud showed Microsoft Azure adoption up to 34% from 20% last year. Likewise, 15% of respondents used Google (goog) Cloud this year, up from 10% a year ago. Twenty percent of the survey respondents are RightScale customers and 61% hail from the U.S.

Meanwhile, AWS remains the market leader, with usage holding steady at 57% year-over-year. Penetration of IBM (ibm) rose slightly from 7% to 8% while Oracle (orcl) Cloud adoption fell slightly from 4% to 3% over the same period.

Get Data Sheet, Fortune's daily tech newsletter.

Public cloud refers to a massive array of servers, storage, and networking aggregated and managed by one company, which then rents out those resources to many customers. This model is gaining traction as companies seek to augment or even replace their own data centers, which is why many legacy tech companies have jumped into a market AWS kicked off in 2006.

As Fortune reported earlier Wednesday, many big software companies, like Salesforce (crm), Workday (wday), Box (box), SAP (sap), and Infor, that typically stream their software from their own data centers are also using AWS, Microsoft, or other public cloud facilities to run some of their new products.

Among a subset of bigger "enterprise customers" RightScale surveyed, 59% said they use Amazon (amzn), up from 56% last year. Azure, meanwhile, saw a bigger jump, with 43% of respondents on Azure now, compared to 26% last year. Microsoft (msft), by virtue of its Windows-and-Office franchises, is already established in many of these accounts and is pushing them to embrace Azure as well. Companies in this enterprise category have 1,000 or more employees.

However, there's a not-so-silver lining to all this adoption. While public cloud is sold as a relatively inexpensive alternative to buying and running your own servers, which typically sit at partial capacity much of the time, the model leads to waste.

If a developer powers up a bunch of cloud servers to run a job and forgets to shut them down when it's done, the company pays for unused computing cycles. And customers are realizing this. More than half (53%) of the respondents said cost optimization is a top priority, and that percentage is higher (64%) for experienced cloud users who have been at it for a while.
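To make the forgotten-servers problem concrete, here is a minimal sketch, not from the article, of the kind of housekeeping script a team might run: it uses boto3 and CloudWatch to flag running EC2 instances whose average CPU has stayed under an idle threshold and, optionally, stops them. The threshold, the look-back window and the dry-run default are illustrative assumptions.

```python
# Minimal sketch (not from the article): flag and stop EC2 instances whose average
# CPU has stayed below a threshold, using boto3 and CloudWatch. The threshold,
# look-back window and dry-run default are illustrative assumptions.
from datetime import datetime, timedelta
import boto3

CPU_THRESHOLD = 2.0      # percent; assumed cutoff for "idle"
LOOKBACK_HOURS = 24      # assumed observation window

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

def average_cpu(instance_id):
    """Average CPUUtilization for one instance over the look-back window."""
    end = datetime.utcnow()
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=end - timedelta(hours=LOOKBACK_HOURS),
        EndTime=end,
        Period=3600,
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    return sum(p["Average"] for p in points) / len(points) if points else None

def stop_idle_instances(dry_run=True):
    """Print idle candidates; stop them only when dry_run is False."""
    reservations = ec2.describe_instances(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )["Reservations"]
    for reservation in reservations:
        for instance in reservation["Instances"]:
            iid = instance["InstanceId"]
            cpu = average_cpu(iid)
            if cpu is not None and cpu < CPU_THRESHOLD:
                print(f"{iid}: avg CPU {cpu:.2f}% over {LOOKBACK_HOURS}h, candidate to stop")
                if not dry_run:
                    ec2.stop_instances(InstanceIds=[iid])

if __name__ == "__main__":
    stop_idle_instances(dry_run=True)
```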


For one thing, a lot of early cloud usage was initiated by a single team of developers trying to get a specific job done, often in a way that was not sanctioned by (or even known to) corporate IT. But all those one-off jobs add up, and if IT has no window into what's going on, costs can get funky, fast.

While respondents think they waste 30% of their cloud spending, RightScale says the real number is likely between 30% and 45%. RightScale sells services that help customers manage workloads running in different clouds and to help minimize waste. Experienced IT pros know the downside of bad cloud practices now and are striving to nip them in the bud.


Discover the pros, cons of bare-metal cloud services – TechTarget

In some cases, public cloud services don't offer admins full visibility and control, particularly around variable workload performance and security. Some providers address this challenge by offering bare-metal cloud.

What are bare-metal cloud services?

A bare-metal cloud service is a variation on infrastructure as a service (IaaS) that allows users to rent and configure single-tenant servers, typically without a virtualization layer. Bare-metal cloud promises the flexibility and scalability of public cloud, with the predictability, granularity and security of local servers.

Not all workloads run well on the virtualized cloud instances that make up many IaaS offerings. For example, legacy applications that demand access to physical hardware, or workloads that are extremely stable and require no scalability, might be better fits for bare-metal cloud.

Bare-metal cloud services are similar to other cloud services -- they're accessible like an Amazon Web Services (AWS) Elastic Compute Cloud (EC2) instance or an Azure D-series VM. The primary difference with bare metal is that the service maps to a physical server rather than a VM. Since the server typically isn't virtualized, users get complete, direct access to the entire server and all of its compute, storage and network resources. A bare-metal cloud instance is almost indistinguishable from a more traditional server, but it typically uses the same on-demand rental model that public cloud providers use.

There are numerous providers in the bare-metal cloud market, including Oracle, IBM, baremetalcloud, Rackspace and Internap.

Major public cloud providers, including Azure and Google, do not have strong bare-metal cloud offerings. AWS comes close with its Dedicated EC2 Hosts service, which commits an entire physical server to the user.

What are the pros and cons of bare-metal cloud services?


The most notable benefit of bare-metal cloud services is direct control of the server and its resources. This is a far cry from typical virtualized cloud instances, which intentionally obscure underlying hardware operations from the users. Additionally, without the virtualization layer, bare-metal cloud reduces the overhead of a hypervisor, which can increase performance.

Because most public IaaS environments are multi-tenant, organizations are concerned with security and compliance. Bare-metal cloud instances address both of these concerns because they provide a single-tenant hardware platform committed exclusively to a single user. However, bare-metal clouds do not guarantee security or compliance; organizations need to understand the legal obligations and industry best practices for proper security and regulatory posture.

Bare-metal cloud instances can also be more cost-effective than public cloud instances because users generally pay only for the underlying hardware rather than for metered usage. However, organizations must weigh those costs against in-house hardware acquisition, deployment and operational expenses.

However, compared to virtualized instances, bare-metal cloud services can be limited. For example, when a physical server is virtualized, admins can provision a wide range of standardized VM types from the underlying hardware. But a bare-metal cloud instance is a complete server, so there is a limited number of instance sizes and types available.

What makes bare-metal cloud management unique?

There are few fundamental differences in managing bare-metal cloud instances versus typical virtual machine public cloud instances.

For example, bare-metal cloud providers like Rackspace and Oracle provide management interfaces, including a console and command-line interface. The Rackspace control panel offers a web-based interface to start the cloud server, access and view KPIs, schedule tasks like snapshots and access support functions. Admins can use control panels to perform tasks such as server resets and power cycling -- tasks unthinkable with common VM instances.

However, bare-metal servers generally require a more granular level of management and control than common cloud instances. For example, Oracle organizes bare-metal cloud services into entities called compartments, which provide isolated cloud resources to different business units or projects. This means it is possible to monitor, manage and track billing for resources by activity or group, and to assign granular security and data protection characteristics to each individual user. Admins familiar with managing VM-based public cloud instances may face an additional learning curve when setting up and managing a bare-metal cloud environment.
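As a rough illustration of the compartment idea, the sketch below creates one with the OCI Python SDK. The article itself shows no code, so treat the class and method names, the config file location and the compartment name as assumptions to verify against Oracle's SDK documentation.

```python
# Hedged sketch: creating an isolated compartment with the OCI Python SDK.
# Class and method names reflect the SDK as commonly documented and are
# assumptions here; the compartment name and description are invented.
import oci

config = oci.config.from_file()                    # reads ~/.oci/config by default
identity = oci.identity.IdentityClient(config)

details = oci.identity.models.CreateCompartmentDetails(
    compartment_id=config["tenancy"],              # parent: the tenancy root
    name="analytics-team",                         # assumed compartment name
    description="Isolated bare-metal resources for the analytics group",
)

compartment = identity.create_compartment(details).data
print("Created compartment:", compartment.id)
```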

Is bare-metal cloud a better option for higher-level services?

Bare-metal cloud services can be a good option when a workload's computing demands are relatively constant and don't require elastic scaling. Workloads that fit this model include those involved with big data analytics, backup and recovery cycles, media encoding tasks, machine learning, visual rendering projects and other I/O-intensive applications.

For example, a big data workload needs to ingest a large amount of data, transform and process that data and then pass results back to storage. Once completed, the server may remain unused for weeks or even months. This makes bare-metal cloud a viable option, rather than buying and owning servers permanently.


Welcome to the Era of Great Data Center Consolidation – Fortune

"Friends don't let friends build data centers," said Charles Phillips, chief executive officer of Infor, a business software maker, about two years ago.

His remark, which launched a flood of T-shirts, came at an Amazon Web Services conference in April 2014, when he announced that Infor would move all of its IT operations into AWS data centers and out of its own.

That was a bold statement at the time, and it presaged a change in how big software companies are building and distributing their products.

Since then, a slew of these software companies, including Box (box), Salesforce, NetSuite, Tableau (data), Workday, and SAP (sap), have all announced plans to use a public cloud (like AWS) as a deployment option for their own software.

A public cloud, referred to within the industry as Infrastructure-as-a-Service (IaaS), consists of massive quantities of servers, storage, and networking owned and operated by one company, and then rented out to others.

Box and Workday, for example, are turning to AWS and IBM (ibm) SoftLayer to run specific types of software. Salesforce has signed on to AWS for its Internet of Things Cloud Suite and other new products. NetSuite named Microsoft Azure as its "preferred cloud partner." (Although now that NetSuite has been bought by Oracle (orcl), which has its own public cloud ambitions, that could change.) JDA Software, which offers supply chain management software, is using Google Cloud Platform.

None of these companies are going as far as Infor, at least not yet. But unless these announcements are 100% marketing hype, they indicate that much more software power will be concentrated in fewer, albeit bigger, data centers owned and operated by just a handful of cloud computing giants.

The arguments seem sound: Let the big cloud providers pull together hundreds of thousands of servers, petabyte upon petabyte of data storage, and networking at their data centers around the world, and software companies can just piggyback on that. That allows the software maker to concentrate on features and functions for customers, not on racks of servers and power supplies running either in a dedicated corporate data center or in facilities owned by third parties like Equinix (eqix), Digital Realty Trust, or Dupont Fabros. But build or no build, that's a major expense.

And, if a U.S. software company throws in with a cloud provider with data centers around the world, it won't have to eat the expense of operating a data center in a country with strict data sovereignty laws requiring that local data stay in country. That is no small benefit.

Get Data Sheet, Fortune's daily tech newsletter.

But isn't there concern about relying so heavily on a small handful of huge players, namely Microsoft, Amazon, and Google? Some software executives admit there is. Privately, they worry that their cloud provider might end up being a competitor as well.

Microsoft (msft), for example, offers its own cloud-based applications, which compete with offerings from many other companies. Amazon (amzn) keeps adding higher-level software to its infrastructure as well. Who knows where that will end?

Marc Benioff, chief executive of Salesforce (crm), downplays the concern, noting that there is no shortage of competition in cloud. Aside from Amazon, Google, IBM, and Microsoft, Benioff cites the various telcos, like Deutsche Telekom, BT, and NTT, that also offer cloud services.

And, Salesforce is not leaving its own data centers behind. "Our infrastructure remains our primary focus as public cloud is still too expensive for many of the primary markets we are in," Benioff tells Fortune via email.

So while Salesforce is working with AWS in Canada and Australia, it will keep using its own data centers as well.

"This is not a one-size fits all solution for infrastructure," says Benioff.

David Clarke, senior vice president of technology development at Workday (wday), which specializes in financial and human resources software for big companies, agreed that expanding geographic reach is a big reason to consider public cloud. Amazon, Microsoft Azure, and IBM SoftLayer maintain data centers all over the world.

"Our first production workload will go on AWS in Canada this year," Clarke tells Fortune . "Whether or not you like running data centers, it's expensive and being able to offer geographic diversity without having to build data centers everywhere is a benefit." Workday itself has three primary data centers in Ashburn, Va.; Portland, Ore.; and Dublin, Ireland.


Another factor is that big businesses have grown more comfortable with the use of massive public clouds, and may push other software suppliers to put services in the customer's cloud of choice.

SAP offers versions of many of its products, like SuccessFactors human resources and Concur travel and expense accounting software, from various cloud partners. Right now, however, SAP's new public cloud-based SAP S/4Hana manufacturing and financial management software runs only on SAP's own public cloud infrastructure. However, the company will consider "the usual suspects" among the giant public cloud players if demand warrants it, Darren Roos, president of the company's S/4Hana business unit, tells Fortune.

As many SAP customers already run some software in AWS or Azure, they may want to "co-locate" the new SAP software there as well, Roos suggests.

But there are some in the tech industry who say that there are many companies of sufficient size with specialized needs that would be better off running their own data centers, but in a cloud-like manner. In this "private cloud" model, all of the computing gear is devoted to one company, but still offers internal users flexibility in adding or deleting resources as needed. And each department or user can be charged for what capabilities they use.

"There are plenty of companies big enough to operate their own data centers," Bryan Cantrill, chief technology officer of Joyent said recently. And that is likely one huge reason Samsung Electronics bought Joyent for its cloud expertise last year.

And then there is Dropbox, the popular file storage and file sharing company that ran almost entirely on AWS until it quietly moved 90% of that workload into its own data centers over the course of a few years. Companies like Dropbox say they know their own needs better than a public cloud provider, and that they're big enough to make the economics work. Their argument is that once a company's workload gets big enough and stable enough not to need all that public cloud flexibility, it is cheaper to run your own data center gear.

Still, it seems that if the public cloud is a viable option, more corporate workloads will flow in that direction. Oracle co-chief executive Mark Hurd has said he expects 80% of corporate data centers to disappear by 2025. Even if he's a little bit off, that's a considerable estimate.

This shift could be generational. A company like Box, founded in 2005, a year before AWS launched its first cloud service, didn't have much of a choice but to provision its own data centers. If the content management company were starting out now, Box chief executive Aaron Levie says that would likely not be the case.

When asked if it was hard to rely on a huge cloud provider that might end up competing with Box, Levie replies, "You know what's harder? Building data centers!"


How open compute cuts server costs in the enterprise – CIO

The Open Compute Project (OCP) means you can get the designs that Microsoft, Facebook and (to a lesser extent) Google use for their data centers. The goal is to get original design manufacturers (ODMs) to build them for you rather than buying standard servers and switches from original equipment manufacturers (OEMs).

"Six years on, it's still mainly the largest of companies with the largest of data centers that are buying OCP designs," Forrester principal analyst Richard Fichera tells CIO.com. "Some of the larger financial services and large manufacturers are actively implementing OCP designs or OCP-like designs."

Even so, he says, the project has had a big impact on the hardware that enterprises and mid-size companies buy. "Buying OCP-compliant systems is really still a high-end phenomenon but there's been a more subtle ripple effect over the last few years that's reached a long way."


When OCP started, almost all the major server vendors had what they called value product lines: systems that were stripped down and a little less expensive, with specs that were a little lighter all the way round, from connectors to power supply efficiency, Fichera says. "OCP really highlighted a lot of the ways they could take further cost out of their servers and, by this point, all the major vendors have servers that have been influenced by the OCP designs. You can see that by the way they design systems and the resulting pricing of the entry level of value systems."

But you won't see those cost-cutting measures reflected in the list prices, Fichera says. "They don't necessarily advertise as 'this is a dirt cheap alternative to run your software,' but it gives them a tremendous amount of wiggle room when it comes to discounting."

[ A peek inside Microsoft Azure's open source server and rack designs ]

Bringing prices down was always part of the OCP plan, says Open Compute Foundation CEO Corey Bell. "We believe that everything should be more open because it allows for greater collaboration and communications; it drives down costs and it drives up effectiveness and efficiency. Where we used to be in the industry, vendors got input from customers but customers didn't have a seat at the table to design products. Collaborating and designing openly is better for everyone, and it drives down costs."

"A lot of the OCP designs were too wild and crazy for enterprise data centers, but they have some very mainstream 2U two-socket designs with very standard server specs now and that's trickled out into the industry," says Fichera.

That's exactly what Bell wanted to see happen. "We're trying to make it easy for everyone to consume OCP and we're pushing to overcome more of the hurdles. As we've brought on more component suppliers and more service providers as well as full products, we're getting closer. When a product is submitted to OCP, we think about who this would work for and how they would get it."

What OCP has already done is make cheaper systems available to everyone, albeit indirectly. The difference between enterprise-class servers and a high-volume cloud server could be as much as 20 percent, Fichera says, but he adds that there's now little difference in price between a value server from the traditional server suppliers and an OCP design.

What keeps many enterprises buying those value servers rather than OCP designs often has nothing to do with the hardware design, Fichera says. Medium-size companies are still very sensitive to service arrangements, he says, giving the example of companies buying several hundred servers who had been approached by server ODM Quanta with a slightly lower price than the traditional vendors, and turned down the deal.

"The rationale behind the decision is that they weren't ready, for that price difference, to take the risk of what it means to be supported by a company like Quanta," Fichera says. "They fully understood that with a couple of hundred servers they wouldn't be getting the focused attention of Dell or HP [that a larger customer would get], but they were nervous about relying on a company with no visible presence in end user support."

That's an area OCP is working on, Bell tells CIO.com. "At the end of the day, it's not just about deployment; for mid-range and smaller businesses, distribution and support is key. That's where some of the major vendors have done a great job." He notes that Dell and HP are already part of the OCP ecosystem, and Lenovo joined in early 2016. "We're increasing and managing the ecosystem to provide that. There are going to be multiple players that enterprise will be able to go to," he promises.

Even larger-scale systems like HPE's Cloudline, which uses an OCP design rather than the proprietary technology used in ProLiant servers and is manufactured in partnership with Foxconn rather than by HPE, are coming down market, says Fichera. "Initially they were very cagey about selling only to really big users but I get the impression they're coming down now. If you want a couple of thousand units instead of ten thousand, I think they'll talk to you now and they have a lot of competition."

[ How Rackspace will stay alive in cloud: Stop competing with Amazon, start partnering ]

While making data center power usage more efficient has been a big focus in OCP designs, that's one area that isn't directly relevant to enterprises (even though they typically pay more for power than cloud providers do). The biggest improvements require esoteric infrastructure like high-voltage DC power distribution that hyperscale cloud providers have adopted but few enterprises have in their data centers.

The good news is that both general customer demand and the lessons learned from OCP systems mean that server makers and the power supply vendors they all use have made their systems more energy efficient anyway. "Systems continue to exhibit big jumps in capacity in terms of performance per server; every generation of the base semiconductors has become more power efficient and both system vendors and CPU vendors have become more focused on power efficiency," says Fichera.

In addition, many enterprises indirectly benefit from power efficiency improvements through cloud usage, Fichera points out. Widely available cloud substitutes for a lot of workloads have reduced the number of people who wake up and have that "I have to build a new data center just to add more servers" moment.

Even before OCP, the idea of buying barebones servers at scale appealed to some large businesses. Fichera knows of one large financial services company (he identified it only as one of the top 10 financial services organizations in the world) that approached an ODM in Taiwan. "They tucked one of their servers under their arm and said, 'Build me this but without this component, without that component, without this management agent; basically, build me a lower-cost version of this server.' They bought probably 20,000 of them."

Bell wants OCP to make that kind of custom manufacturing accessible to businesses of all sizes. "We want to make it easier to buy and use OCP, and to control your own destiny. With open collaboration, you can get exactly what you want, as opposed to being sold things you don't need."


China’s Bitcoin Traders Are Losing Confidence in Exchange Prices – CoinDesk

The decision by China's bitcoin exchanges to freeze withdrawals is impacting the country's over-the-counter (OTC) markets.

As reported by CoinDesk last week, two of China's 'Big Three' bitcoin exchanges abruptly suspended bitcoin withdrawals in response to new pressures from the People's Bank of China, a move that was followed by similar, though less restrictive, policy updates from exchanges across the country.

But while the price was not significantly impacted, traders are reporting that they have had to shift strategies, as they believe exchanges no longer act as a reliable price indicator.

In conversation with CoinDesk, representatives from major OTC firms indicated that they are now largely using the US dollar price (as listed on British Virgin Islands-based Bitfinex) to determine the price of bitcoin.

Zhou Shouji, operator of OTC trading firm FinTech Blockchain Group, for example, said his firm now uses the USD rate, as did OTC trader Zhao Dong, who went so far as to describe China's exchanges as "totally disabled".

"Since you can't withdraw, the price is meaningless if you want to trade bitcoin," Zhao said, telling CoinDesk:

"The price on Chinese exchanges is fake price now."

Traders further indicated that activity is increasingly taking place on peer-to-peer (P2P) exchange platforms including Bitcoinworld and Bitpie, a wallet and P2P service set up by startup Bither.

Those interviewed indicated that they believe such services could become more widely used, especially if further actions are announced by the central bank.

Kong Gao, overseas marketing manager at OTC firm Richfund, described the prevailing sentiment as one of continued uncertainty.

"I'm just waiting to see what will happen next," he told CoinDesk."I don't think anyone can predict what the PBOC will impose next."

Shredded yuan image via Shutterstock



Peer-to-Peer Bitcoin Trading Surges in Venezuela – Reason (blog)

On February 2, Venezuela's leading bitcoin exchange, SurBitcoin, was forced to suspend operations when its bank account was revoked. According to Rodrigo Souza, who runs SurBitcoin's trading platform, the bank closed the account in anticipation of a nationwide crackdown on bitcoin use in Venezuela after the police raided a warehouse with 11,000 mining computers. SurBitcoin is in talks with other banks, and hopefully it will be operating again soon.

At its core, Bitcoin is a peer-to-peer system that allows users to exchange digital currency without permission from the government or any third party. That's why it's the ultimate libertarian technology. Bitcoin exchanges, like SurBitcoin, however, are subject to government control because they buy and sell bitcoins on behalf of their users and rely on a company bank account to collect and pay out money.

As Souza stressed in an interview last year, exchanges like SurBitcoin aren't actually necessary. They make buying and selling bitcoins more convenient, but users can always revert to peer-to-peer trading. As Souza put it, "how can [the government] stop software running on the internet?"

As he predicted, SurBitcoin's closure has led to a surge in peer-to-peer trading. LocalBitcoins, a site where users connect to buy and sell bitcoins, makes its trade volume public through an API. Last week, 464 bitcoins were exchanged in Venezuela on LocalBitcoins, the equivalent of nearly $470,000 based on today's price. That's close to a 50 percent increase in volume since SurBitcoin stopped operating. (LocalBitcoins' previous trading volume peak was 377 bitcoins the week of October 15, 2016, but, at the time, bitcoin was worth almost 40 percent less than it is today.)
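For readers who want to pull similar volume data themselves, here is a hedged sketch against LocalBitcoins' public ticker. The endpoint URL, the currency code and the response fields are assumptions, and the ticker reports rolling volume rather than the weekly historical series cited above, so verify the details against the site's API documentation.

```python
# Hedged sketch: read LocalBitcoins' public ticker for per-currency trade volume.
# The endpoint and field names are assumptions about the site's public API; the
# weekly figures quoted in the article come from a different historical dataset.
import requests

def localbitcoins_volume(currency="VEF"):
    """Return recent LocalBitcoins volume (in BTC) for one fiat currency."""
    resp = requests.get(
        "https://localbitcoins.com/bitcoinaverage/ticker-all-currencies/", timeout=10
    )
    resp.raise_for_status()
    entry = resp.json().get(currency)
    return float(entry["volume_btc"]) if entry else None

if __name__ == "__main__":
    # "VEF" was the bolivar code at the time of the article; later data uses "VES".
    print("Recent LocalBitcoins volume (BTC):", localbitcoins_volume("VEF"))
```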

SurBitcoin's average weekly trade volume was about 330 bitcoins when it shut down. So about two-thirds of SurBitcoin's activity has moved to LocalBitcoins. (A similar phenomenon is happening in China.)

Why is bitcoin in Venezuela seemingly inexorable? For more, read "The Secret, Dangerous World of Venezuelan Bitcoin Mining." Or listen to this week's episode of EconTalk with Russ Roberts, where I discussed bitcoin in Venezuela and the impact cryptocurrency is having throughout Latin America.

As I told Roberts, I first learned about bitcoin when listening to a 2011 EconTalk interview he did with Gavin Andresen, a pioneer in the field. I remember thinking, "this can't possibly work." Six years later, in part through my reporting on Venezuela, I'm convinced bitcoin will change the world.

Listen to the interview below. (Bonus link: Nick Gillespie interviewed Russ Roberts in 2014 about his book on Adam Smith.)


Top 6 Bitcoin Trading Bots – The Merkle

Trading bots are rather common in the bitcoin world, as very few traders have time to stare at the charts all day. Most people trade bitcoin as a way to generate passive income while working their regular day jobs. With so many people relying on trading bots, the question becomes which ones can be trusted and which should be avoided. Below is a list of known cryptocurrency trading bots, though your mileage may vary when using them.

NOTE: The Merkle does not condone the use of trading bots. The Merkle is not responsible for any financial losses sustained while using the software mentioned below. The Merkle is not affiliated with any of these trading bots.

One of the very first automated bitcoin trading bots to ever be created goes by the name of BTC Robot. While it seems to do the job and is quite easy to set up, users' mileage may vary heavily when using this tool. Some people seem to be making modest profits, whereas others seem to struggle to get it to work properly. There is a 60-day refund policy, which makes it a no-brainer to try out regardless.

The Gekko trading bot is an open source software solution that can be found on the GitHub platform. It was last updated a month ago, which seems to indicate it is still being actively developed. Using this automated trading bot seems rather straightforward, as it even comes with some basic strategies. It is not a high-frequency trading bot by any means, nor will it exploit arbitrage opportunities. With a good list of supported exchanges, Gekko could be worth checking out.
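To give a sense of what a "basic strategy" in such a bot looks like, below is a generic moving-average crossover sketch. It is a standalone Python illustration rather than Gekko's actual (JavaScript) strategy API, and the synthetic prices are made up; a real bot would feed live candles into logic like this.

```python
# Generic illustration of the kind of basic strategy bundled with trading bots:
# a simple moving-average crossover. This is a standalone sketch, not Gekko's
# actual strategy API, and the price series below is synthetic.
def sma(prices, window):
    """Simple moving average over the trailing `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, fast=10, slow=30):
    """Return 'buy', 'sell', or 'hold' based on a fast/slow SMA crossover."""
    if len(prices) < slow + 1:
        return "hold"                      # not enough history yet
    fast_now, slow_now = sma(prices, fast), sma(prices, slow)
    fast_prev, slow_prev = sma(prices[:-1], fast), sma(prices[:-1], slow)
    if fast_prev <= slow_prev and fast_now > slow_now:
        return "buy"                       # fast average crossed above slow
    if fast_prev >= slow_prev and fast_now < slow_now:
        return "sell"                      # fast average crossed below slow
    return "hold"

if __name__ == "__main__":
    candles = [1000 + i * 3 + (i % 7) * 5 for i in range(60)]   # synthetic closes
    print(crossover_signal(candles))
```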

One of the more attractive albeit unknown solutions goes by the name of CryptoTrader. The service offers cryptocurrency users automated trading bots running on cloud platforms. Not having to install unknown software is a big plus, albeit it remains to be seen if this platform is legitimate. One intriguing feature is how CryptoTrader features a strategies marketplace where anyone can buy or sell their favorite trading strategy.

Another open-source solution for bitcoin traders goes by the name of Zenbot. Although this bot has not seen any major updates over the past few months, it is available to download and the code can be modified if needed. This marks the third iteration of Zenbot, which is still a lightweight and artificially intelligent bitcoin trading bot. It is also one of the very few solutions capable of high-frequency trading and supporting multiple assets at the same time. According to the GitHub page, Zenbot 3.5.15 makes a 1.531 ROI in just three months, which is quite surprising.

Although technically not a bot in the traditional sense, Tradewave is a platform allowing users to create automated bitcoin trading strategies. Users can connect most of the major exchanges to enable live trading within a few minutes. Moreover, there are quite a few trading strategies shared by community members for other users to try out. Tradewave is not free to use, though plans start at just US$14 per month.

The Haasbot is somewhat popular among cryptocurrency enthusiasts. On paper, Haasbot does all of the trading legwork on behalf of the user, although some input is required. Haasbot supports all of the major exchanges and is capable of recognizing candlestick patterns. Considering it costs between 0.12 BTC and 0.32 BTC per three-month period to use this tool, one has to be committed to using the software and hopefully make a profit from doing so.

If you liked this article, follow us on Twitter @themerklenews and make sure to subscribe to our newsletter to receive the latest bitcoin, cryptocurrency, and technology news.


Bitcoin’s Growth in the UK Continues to Be Stifled by Banks – CryptoCoinsNews

Banks in the United Kingdom are turning a deaf ear to bitcoin exchanges, despite the government's pro-blockchain position, according to financial writer Roger Aitken, writing in Forbes. Unless the situation changes, the banks will undermine bitcoin's progress and drive cryptocurrency entrepreneurs out of the banking system.

Cryptopay, a bitcoin brokerage, recently informed customers that it will no longer support British Pound deposits and withdrawals on account of new bank policies. Such incidents have increased as bitcoin has gained popularity.

Cancellation of GBP deposit and withdrawal facilities limits people to Single Euro Payments Area (SEPA) transfers, making Cryptopay's buying and selling useless to most British customers.

A dozen or more U.K. brokerages and bitcoin exchanges have suffered over the past three to four years as banking facilities have become unavailable. Some have closed or resorted to awkward arrangements.

Britcoin, which became rebranded as Intersango, started in 2011. It faced problems with U.K. bank transfers before eventually closing. An August 2012 update noted that bridging the gap between bitcoin and the conventional banking system was costly on account of technical issues, missing transfers, and accounts frozen and closed without warning.

In 2014, Bit121 had a promising start, but banks withdrew their support and the exchange closed.

In Bitcoin We Trust suffered the same fate. It resorted to using postal orders before giving up.

Coinfloor, one of the only U.K. exchanges still operating, uses SWIFT transfers, which incur hefty costs and delays. The minimum transfer is £1,000 (c. $1,250).

CoinJournal, a bitcoin publication, saw its banking services come to an abrupt end after its U.K. banking provider Barclays terminated its business account. CoinJournal received no official warnings prior to its account closure. Even more alarmingly, Barclays still hasn't given a reason for the extreme action.

CoinJournal believes the decision taken by Barclays to close its business account was an automated call, after seeing a pattern of banking transactions involving prominent bitcoin exchange and service provider Circle.

"The decision was likely a result of us using Circle to transfer fiat from ad revenue into bitcoin to pay our writers and some overheads," a representative for the publication told CCN.

Similar scenarios have played out in Australia and New Zealand.

BitNZ, a New Zealand bitcoin exchange, has announced it is closing due to the refusal of New Zealand banks to allow bank accounts to trade bitcoins, and has advised customers to withdraw all funds before April 15, 2017.

The Australian Competition and Consumer Commission is scrutinizing attempts by Australia's biggest banks to swallow fintech companies developing technologies like blockchain solutions in the financial sector.

Peer-to-peer services match individual buyers and sellers in the U.K. in lieu of traditional exchanges. Trust is established by reputation.

Once a buyer has paid, usually with a bank transfer, the seller sends the bitcoins.

As for other nations, Russia recently relaxed its regulatory position and has taken a wait-and-see approach. It has effectively legalized bitcoin and allowed exchanges to operate. Switzerland is a more progressive country. It is easy to buy bitcoins there through a network of ATMs on the rail system.

In Japan, it is possible to pay electric bills with bitcoin.

The United States has a more complex regulatory framework. But progress is on the horizon since the New York BitLicense took effect in 2015, with other states following a similar approach.

Bitcoin is legal in China, although the central bank recently stopped highly leveraged trading.

The banking sector is clearly at odds with the U.K. government, which is openly pro-blockchain. The situation is peculiar, with the government saying the country is open to bitcoin but the banking sector standing in the way.

Since the financial crisis, the taxpayer has become the majority shareholder in the Royal Bank of Scotland, holding around 82% of the bank. This would normally translate into a certain amount of leverage by the taxpayer.

The U.K. also has a reputation for being a fintech hub, to which the banking sector seems to have taken exception.

For whatever reason, the banks have closed ranks and chosen not to work with bitcoin.

The fact that bitcoin is decentralized and fiat currency is centralized could be at the root of the conflict.

Also read: Blockchain platform Waves raises more than $2m at the start of the crowdsale campaign

Money cannot flow easily from the blockchain economy to the traditional financial sector and vice versa without banks' cooperation. The bitcoin sector is not large enough to offer all the goods and services needed to make bitcoin a sufficiently broad means of payment.

Bitcoin's volatility also makes it an unsuitable unit of account or store of value. While it's a great transfer medium, its price against fiat fluctuates too much for most people.

The bitcoin economy won't expand until bitcoin is better suited as a means of payment. But it won't be better suited without more growth and stability.

Waves, a custom blockchain tokens platform, offers a solution: fiat-backed blockchain tokens. It raised $16 million last summer through crowdfunding. Waves can act as a gateway between the blockchain and the fiat world.

Customers pay money into the gateway using a bank transfer or another suitable means, and the gateway issues them the same sum in blockchain tokens.

The same exchange occurs in reverse when customers cash out their Waves GBP and have them sent as real GBP to their bank account. Waves essentially serves as a toolkit.
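The sketch below models that issue-and-redeem bookkeeping in miniature, purely as a conceptual illustration: it is not Waves' actual code or API, and the class, method names and amounts are invented for the example.

```python
# Conceptual sketch only: the deposit/issue and redeem/payout bookkeeping a fiat
# gateway like the one described above has to do. This is not Waves' actual code
# or API; the names and amounts are illustrative.
class FiatGateway:
    """Tracks fiat held in the bank account against tokens issued on-chain."""

    def __init__(self, currency="GBP"):
        self.currency = currency
        self.bank_balance = 0.0          # fiat received via bank transfer
        self.tokens_issued = {}          # blockchain address -> token balance

    def deposit(self, address, amount):
        """Customer wires fiat in; the gateway issues the same sum in tokens."""
        self.bank_balance += amount
        self.tokens_issued[address] = self.tokens_issued.get(address, 0.0) + amount

    def redeem(self, address, amount):
        """Customer returns tokens; the gateway pays out real fiat by bank transfer."""
        if self.tokens_issued.get(address, 0.0) < amount or self.bank_balance < amount:
            raise ValueError("insufficient token or fiat balance")
        self.tokens_issued[address] -= amount
        self.bank_balance -= amount

    def fully_backed(self):
        """Every issued token should be matched 1:1 by fiat held in the bank."""
        return abs(sum(self.tokens_issued.values()) - self.bank_balance) < 1e-9

gateway = FiatGateway("GBP")
gateway.deposit("3PExampleAddress", 500.0)
gateway.redeem("3PExampleAddress", 200.0)
print(gateway.fully_backed())            # True: tokens outstanding equal fiat held
```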

Sasha Ivanov, CEO and founder of Waves, noted that Waves can make money more efficient. By putting fiat money on the blockchain, Waves can make it more transparent and faster, and it can reduce the cost of sending it abroad.

Ivanov thinks Waves can introduce competition and encourage banks to become more accountable. If banks in one sector in one country won't work with Waves, it will work with those in another jurisdiction.

Waves does not immediately solve the problem of U.K. banks' hostility to bitcoin, but it suggests the roadblocks are not insurmountable. The answer may be to work around them rather than with them.

Image from Shutterstock.


Banking Giant Mizuho Invests in Japan’s Biggest Bitcoin Exchange … – CoinDesk

Japan's largest bitcoin exchange by volume has announced a new round of fundraising.

In a press release today, Tokyo-based bitFlyer revealed it had raised new capital from financial firms including Sumitomo Mitsui, Mizuho Financial Group and Dai-ichi Life Insurance Company. A report by Japan-based news source Nikkei indicated that the funding equalled roughly ¥200m ($1.76m).

According to statements from the startup, the fundraising follows BitFlyer's work with Sumitomo and Mizuho on private blockchain solutions aimed at providing the core infrastructure for enterprise use cases.

Notably, the funding also comes during what could be a period of increasing competition for the startup. As acknowledged in its release, legislation is slated to become law in June that would enable regulated firms to enter Japan's bitcoin and cryptocurrency markets.

As reported by CoinDesk, the move is expected by many in the blockchain industry to open the doors for existing financial firms to begin offering such services.

Still, BitFlyer is expected to be competitive in the market, boasting more than $35m in funding, the most recent of which was a $27m Series C. Disclosure: CoinDesk is a subsidiary of Digital Currency Group, which has an ownership stake in BitFlyer.

BitFlyer image via Facebook

