
Exploring the cloud laboratory: biotechnology and cloud computing – Information Age

Cloud computing within the biotech sector can take big data analysis to the next level by means of performance, connectivity, on-demand infrastructure and flexible provisioning

The continued drive for innovation puts immense pressure on IT departments to develop new technologies at speed, while also making sure that they do this cost effectively.

Add to this the fact that, more so than other industries, biotech firms are extremely tightly regulated. As a result, IT groups within this industry are often reluctant to introduce more complexity into what is already a very complex environment.

To them, expanding a data centre can often feel a whole lot easier than navigating the regulations of the cloud. Despite this, demand for cloud computing in life sciences research and development is escalating because of the benefits it brings to the industry: meeting and even exceeding regulatory requirements, for example.

iland has worked with many companies in the healthcare, life sciences and biotech industries, and knows from experience that the implementation of cloud computing in biotechnology empowers organisations with the control and flexibility needed to lead the way in both the research world and the business world.

>See also: 10 trends that will influence cloud computing in 2017

For example, iland recently worked with a US-based biotechnology organisation on its backup and disaster recovery (DR) strategy, and was able to drive global data centre consolidation with host-based replication to the cloud. As a result, the organisation's DR testing and auditing processes were greatly simplified and streamlined, driving significant cost savings as well as compliance assurance.

If you still need convincing, here are three additional key benefits that cloud brings to biotech organisations.

When the Human Genome Project began, it was one of the most extensive research projects in the field to date, costing billions of pounds and lasting over a decade.

These days, thanks largely to cloud technology, a human genome can be sequenced in just 26 hours. Drug R&D, clinical research and a whole host of other areas have benefited just as much from the rapid growth of computational power. The better your technology is at crunching huge sets of data, the quicker you can innovate.

Cloud computing within the biotech sector can take big data analysis to the next level by means of performance, connectivity, on-demand infrastructure and flexible provisioning.

Labs can also benefit from immense computing power without the cost and complexity of running big onsite server rooms. They can also scale up at will in order to make use of new research and ideas almost instantly.

Concerns have been voiced that so-called scientific computing in the cloud may make results less reproducible. One concern is that cloud computing will become a computational black box that obscures details needed to accurately interpret the results of computational analyses.

>See also: How cloud computing can transform the pharmaceutical industry

In actual fact, by leveraging the application program interfaces (APIs) in the iland cloud, biotech customers are able to integrate cloud data back into on-premises IT systems to ensure that data analyses done in the cloud can be easily shared and consumed by other applications.

Essentially, cloud computing services bring more players to the table to solve the giant puzzle. It's a win-win situation from an economic and patient standpoint, and several big-name companies are jumping on the biotech cloud bandwagon.

Biotech companies need to maintain strong access and authentication controls, while also being able to collaborate easily. For this reason audit trails and other measures are often required to verify that information has not been improperly altered, and that good experimental and manufacturing procedures have been followed.

At the same time bio-technologists need to be able to access and share data across multiple departments or even multiple companies.

Cloud computing in biotechnology makes this all possible: it centralises data, ensuring security and data sovereignty while facilitating collaboration.

It supports extensive user and role based access control, two-factor authentication and integrity monitoring to prevent improper access and changes. In addition to data encryption, vulnerability scanning and intrusion detection, these measures facilitate security and compliance, without disrupting the internal workflow.
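As an illustration, the two-factor authentication mentioned here is typically built on time-based one-time passwords (TOTP, RFC 6238). The sketch below shows the generic algorithm, not any particular vendor's implementation; the secret shown is the RFC test key:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, period=30, digits=6, now=None):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if now is None else now) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII key "12345678901234567890", t=59s -> "94287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", digits=8, now=59))
```

Because both the server and the user's authenticator app derive the same code from a shared secret and the clock, no code ever crosses the network twice.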

Complex regulatory requirements and logistics combined with niche markets make efficiency paramount within biotechnology. Even minor mistakes as a result of sloppy process management can easily result in major issues.

Real-time operational reporting dramatically improves efficiency, quality control and decision making, allowing organisations to react instantly to challenges and opportunities, both internal and external.

>See also: Managed cloud: making the most out of public cloud computing

As well as enhanced billing visibility and resource management functions, the release of the vendor's secure cloud services means that the iland cloud now includes on-demand security and compliance reports.

This advanced cloud management functionality is designed to foster strategic, self-sufficient control of a cloud environment, optimising overall cloud usage and costs to drive business initiatives and growth.

Without a shadow of a doubt, cloud technology can help biotechnology companies build the future. From research and development to marketing, computing affects everything an organisation does.

Sourced by Monica Brink, director of marketing, iland


Step aside cloud computing, ‘fog computing’ is here – TechRadar

Cloud computing (performing tasks on external servers and sending the results back to your machine) is rather popular, it's safe to say.

Going all the way back to the 1970s, firms have relied on remotely located computing clusters to solve hardware-intensive problems. Today, it's the dominant way of doing business for many companies.

But there's a problem. While hosting computing power on a remote server is handy, it's also a security risk; if that server is compromised, then the data of thousands, even millions, of users can be exposed in one fell swoop.

So computer scientists at the University of Camerino have developed a new concept that spreads data across many servers, with no single location. It uses virtual buffers to endlessly relocate data packets without a file ever being complete in one place. They're calling it 'fog computing'.

"Our proposal is based on this idea of a service which renders information completely immaterial in the sense that for a given period of time there is no place on earth that contains information complete in its entirety," said Rosario Culmone and Maria Concetta De Vivo, who developed the concept.

"We want to realise a fog of files rather than a cloud."

They compare the system to a letter with a secure tracking device that gets sent continuously between different post offices and is never delivered. It would be hard for a thief to find it, but when you need it again you simply activate the tracking device, and it can then be easily located.
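The idea can be sketched in a few lines: chunks of a file are scattered across buffers and continuously relocated, with only a private index (the "tracking device") recording where each piece currently lives. This is a toy illustration of the concept, not the researchers' actual protocol:

```python
import random

def scatter(data, n_buffers, chunk_size=4):
    """Split data into chunks and scatter them so no one buffer holds the file."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    buffers = {b: [] for b in range(n_buffers)}
    index = []  # the private "tracking device": which buffer holds each chunk
    for seq, chunk in enumerate(chunks):
        b = random.randrange(n_buffers)
        buffers[b].append((seq, chunk))
        index.append(b)
    return buffers, index

def relocate(buffers, index):
    """Move every chunk to a new random buffer; only the index is updated."""
    for seq, b in enumerate(index):
        pos = next(i for i, (s, _) in enumerate(buffers[b]) if s == seq)
        _, chunk = buffers[b].pop(pos)
        nb = random.randrange(len(buffers))
        buffers[nb].append((seq, chunk))
        index[seq] = nb

def gather(buffers, index):
    """'Activate the tracking device': reassemble the file from the index."""
    return b"".join(next(c for s, c in buffers[b] if s == seq)
                    for seq, b in enumerate(index))

buffers, index = scatter(b"a fog of files rather than a cloud", n_buffers=8)
relocate(buffers, index)  # in the real concept, relocation never stops
assert gather(buffers, index) == b"a fog of files rather than a cloud"
```

Without the index, an attacker who compromises any single buffer sees only disordered fragments, never the complete file.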

"The inaccessibility by others, the non-location, and the dynamics of the system offer significant advantages in terms of security, but raise some legal problems. the pair said.

The full details of Culmone and De Vivo's "fog computing" system were published in the International Journal of Electronic Security and Digital Forensics.


IBM and Wanda form China cloud computing partnership – Reuters

SHANGHAI - IBM and a unit of China's Dalian Wanda Group, a property and entertainment conglomerate, agreed on Sunday to team up to provide cloud services to Chinese companies, the U.S. technology provider said.

Through a newly formed venture, Wanda Cloud Company, they will offer select IBM cloud infrastructure and platform as a service (IaaS and PaaS) technologies in China, IBM said.

The venture between IBM and Wanda Internet Technology Group will be "responsible for distributing, building and operating the IBM cloud platform in China", an IBM spokeswoman said.

In November, Alibaba Group Holding Ltd's cloud unit announced plans to open four new data facilities outside China in a bid to grab market share from leading players Amazon.com and Microsoft.

Research firm Canalys expects the global market for cloud computing, defined as the storage of data on remote networks rather than local servers, to reach $135 billion by 2020.

(Reporting by Alexandra Harney; editing by Jason Neely)



IBM and Red Hat aim to boost hybrid cloud computing, OpenStack usage – ZDNet

IBM has signed two alliances aimed at encouraging use of hybrid cloud computing.

IBM and Red Hat are working together to encourage the use of OpenStack and make it easier for companies to shift their Linux workloads into private clouds.

IBM said that Red Hat OpenStack Platform and Red Hat Ceph Storage on IBM Private Cloud will be generally available at the end of March, ahead of which IBM has become a Red Hat Certified Cloud and Service Provider. Big Blue said this would help "enterprises benefit from the OpenStack Platform's speed and economics".

Also as part of the agreement, Red Hat Cloud Access will become available for IBM Cloud by the end of the second quarter, allowing Red Hat customers to move unused Red Hat Enterprise Linux subscriptions from their data centers into IBM Cloud data centers worldwide.

Red Hat Cloud Access allows Linux customers to retain services and support while moving workloads into the cloud.

"Our collaboration with IBM is aimed at helping enterprise customers more quickly and easily embrace hybrid cloud," said Radhesh Balakrishnan, general manager of OpenStack at Red Hat. "Now, customers who don't have in-house expertise to manage an OpenStack infrastructure can more confidently consume Red Hat OpenStack Platform and Red Hat Ceph Storage on IBM Private Cloud."

IBM and Red Hat said they will provide the hybrid cloud infrastructure to help customers more efficiently run cloud applications using OpenStack APIs. Customers will be able to provision cloud infrastructure faster and, using Red Hat Cloud Access, migrate existing workloads and Red Hat subscriptions to IBM Cloud, or use the software and infrastructure on a pay-as-you-go basis.

IBM and Red Hat said they will jointly sell new offerings for private cloud deployments, including workload migrations, disaster recovery, capacity expansion and data center consolidation.

IBM has also signed a deal with Veritas, also around hybrid cloud, to help enterprises working with increasing data volumes better manage, optimize and protect data across hybrid cloud environments.

Veritas has certified the IBM Cloud Object Storage family of software and cloud services for use with Veritas NetBackup 8.0, making it easier for customers to migrate data from on-premises systems to the cloud for greater storage capabilities.

In turn, IBM has certified NetBackup 8.0 to run on the IBM Cloud to offer clients additional data protection for cloud-based workloads. NetBackup 8.0 is due to be available in the second quarter and will be available for order from the IBM Bluemix Catalog of services.


Words to go: Containers in cloud computing – SearchCloudComputing – TechTarget

With the popularity of containers in cloud computing, more organizations are choosing to not think outside the box.

Containers have been around for a while, but Docker recently helped bring them into the spotlight for enterprise use. As the cloud evolves, more enterprises see the benefits of adopting hybrid and multicloud models, but it's a challenge to ensure software runs reliably when moved from one environment to another. Containers have come to the rescue by wrapping an application and all its components into a single, more portable package.

And, with the growing popularity of containers in cloud computing, major providers, including Amazon Web Services (AWS), Azure and Google, offer container services and orchestration tools to manage container creation and deployment.

But before you start to box up your processes, get familiar with these key terms for containers:

Application containerization: Containerization -- also known as container-based virtualization -- is a distributed application deployment model that offers an alternative to launching a VM for each application. Isolated environments, called containers, instead run on a single control host and share the same OS kernel. Compared to traditional virtualization, containerization allows for greater efficiency for memory, CPU and storage.

Because organizations can create containers more quickly than they could hypervisor-based instances, containers also encourage a more agile environment, promoting continuous delivery methods as well as the use of microservices.

Containers as a service (CaaS): Cloud providers, such as AWS, Azure and Google, offer container-based virtualization as a service, providing orchestration, management and compute resources for containerized applications. This model, known as CaaS, typically consists of a Linux-based OS, a container runtime, a container orchestration tool and a container registry.

Docker: Docker is an open source platform that enables the deployment of Linux applications as containers. Docker containers can run in multiple locations including VMs, bare-metal servers, and public cloud instances. Major cloud providers, including AWS, Azure and Google, support Docker containers.

Google Kubernetes: Kubernetes is Google's container management system for public, private and hybrid clouds. The open source system manages deployment and scaling for Docker and Rocket containers, and also provides orchestration and load balancing tools. As with Docker, AWS, Google and Azure all support Kubernetes. Developers can move Kubernetes container workloads across cloud platforms without having to alter the code.

Google Container Engine (GKE): GKE is an orchestration and cluster management system for Docker containers in cloud computing. The clusters consist of a group of Google Compute Engine instances that run Kubernetes. GKE provides access to the Google Container Registry to store and access private Docker images. Stackdriver Logging and Stackdriver Monitoring are also available to monitor the health of the applications.

Microsoft Windows Containers: Microsoft Windows Containers are isolated container environments in Windows Server 2016. Two types of containers are supported: Windows Server Containers and Hyper-V Containers. Organizations can manage these containers with Docker, PowerShell or Azure, and integrate them with existing Windows technologies.

Azure Container Service (ACS): ACS is an open source orchestration system based on Apache Mesos that can manage fleets of containers in cloud computing. The service supports DC/OS, Docker Swarm and Kubernetes to scale and orchestrate. Because of open source components in the orchestration layers, applications are fully portable.

Amazon EC2 Container Service (ECS): Amazon ECS is a container management service that supports Docker containers and runs applications on a managed cluster of Amazon EC2 instances. Users create and manage Docker containers through a set of APIs. The service evaluates and monitors CPU usage and supports its own open source scheduler, AWS Blox, as well as third-party schedulers. Users also have access to other Amazon EC2 features through API calls, such as Elastic Load Balancing, security groups and identity and access management roles. Amazon EC2 Container Registry is integrated with Amazon ECS to manage, store and deploy Docker container images.

Open Container Initiative (OCI): OCI is a collaborative project to establish common standards for containers. The project is hosted under the Linux Foundation, and is independent from commercial organizations. Currently, OCI has two specifications: the Runtime Specification and the Image Specification. Members of the OCI include vendors such as AWS, Google, Microsoft and IBM.


Star2Star Wins 2016 Cloud Computing Excellence Award – Marketwired (press release)

SARASOTA, FL--(Marketwired - March 20, 2017) - Star2Star Communications, the World's Leading Hybrid Cloud Communications Solution, today announced that the company is a winner of the 2016 Cloud Computing Excellence Award.

The Cloud Computing Excellence Award, presented by Cloud Computing magazine, honors vendors who have most effectively leveraged cloud computing in their efforts to bring new, differentiated offerings to market.

"We are the pioneers of hybrid cloud communications, and we're pleased that the industry is acknowledging it as a superior architecture" said David Portnowitz, Chief Marketing Officer at Star2Star. "We started off ahead of the curve with our innovative hybrid cloud-based unified communications solution, and we've never looked back," he continued.

Star2Star launched the concept of Hybrid Architecture more than ten years ago. The Star2Star architecture combines the best aspects of both cloud-based and premise-based Unified Communications systems. Since its introduction, the company has continually improved and expanded its product portfolio as part of its complete, end-to-end solution.

About Star2Star:

Founded in 2006 in Sarasota, Florida, Star2Star Communications delivers the World's Leading Hybrid Cloud Communications Solution. Star2Star's award-winning, patented Constellation Network overcomes the reliability and quality limitations of other communications technologies.

Star2Star unifies customers' voice, video, fax, instant messaging, and presence management into a single, easy-to-use system. The company has one of the highest long-term customer retention rates in the industry at 99.85% and has been recently recognized by numerous leading analysts.

In the past six years, Star2Star was named to the Forbes Most Promising Companies list, the Deloitte Technology Fast 500 five times, and the Inc. 500|5000 six times. The company was also named by Infonetics as a Top 10 Hosted Business VoIP/UC Provider, as well as in the Gartner Magic Quadrant for Unified Communications as a Service, Worldwide.

Star2Star Communications solutions are sold globally through a diversified network of partners that includes distributors, master agents, managed service providers and certified installing dealers. Star2Star systems are used by hundreds of thousands of business users, including many large national chains with multi-location footprints.



Experimenting with Bitcoin on Your Honeymoon – CryptoCoinsNews

A couple's honeymoon might not seem the time to undertake a social experiment with the digital currency Bitcoin, but that's exactly what one soon-to-be-married couple is planning when they tie the knot at the end of the month.

On 28 March, 2017, South African residents Wilhelm Lehmann and Natasha Bezuidenhout are due to get married, and just like most newlyweds they'll be enjoying their time together on their honeymoon.

For them, this is the perfect time to demonstrate the real-life value of Bitcoin.

Speaking to CCN, Lehmann, a telecoms and IT specialist who has used Bitcoin extensively and whose hobbies include cryptography and firearms, said that he has been following Bitcoin since its inception, concluding that it might be possible to live solely on the currency. He adds that this is a challenge he wants to prove possible after the honeymoon.

He said that if the experiment is successful, it will further support his view of Bitcoin as a real-world currency, which he hopes to explain in a whitepaper after the honeymoon.

It would prove you can travel to multiple countries using a single currency stored virtually in the cloud, so you don't have to worry about losing credit cards [or] cash being stolen.

The couple is planning to travel for a week to Zanzibar, Mauritius and South Africa. Using Bitcoin Core, Electrum and paper wallets, Lehmann says that he plans on pre-booking everything, choosing vendors that use payment processors that accept the digital currency.

However, while many may consider this too short a time to determine the currency's value in day-to-day expenses, Lehmann says how much they can do depends on the availability of Bitcoin, which they are hoping friends and family will put toward their honeymoon fund.

He states:

Currently our main challenge is getting enough Bitcoin together to pay for our honeymoon.

Bitcoin is one of the fastest-growing digital currencies, with a market cap of just over $20 billion. Ethereum, in second place, is valued at just under $3 billion.

One Bitcoin is currently valued at around $1,248, a slight drop from its previous high of $1,350 after the SEC rejected the approval of the first Bitcoin ETF. Despite such setbacks, the currency is demonstrating that it is much stronger than previously thought.

Not only that, but the currency is illustrating its ease of use, which millions of people are experiencing.

Lehmann states:

People are moving to convenience, and online transactions are more common every day.

He adds that as a backup he'll be taking his credit cards with him, as he says he can't let "our honeymoon be spoiled because the spa won't take Bitcoin for a massage". He concludes, though, that this would be a last resort if there was no way of getting what they needed with the currency.

Featured image from Shutterstock.


Jihan Wu Wants To Accelerate The Bitcoin Unlimited Hard Fork Regardless of Community Sentiment – newsBTC


Jihan Wu is known for expressing strong opinions on bitcoin and cryptocurrency. Ever since he pledged support to Bitcoin Unlimited, though, he has come a bit unhinged. In a recent tweet, he proposes to speed up the BU hard fork. The reason for doing so is a futures contract on Bitfinex which is seemingly offensive towards big-block supporters.

It is evident there is something brewing in the Bitcoin Unlimited community. A lot of recent statements have caused quite a stir, and things don't look set to improve anytime soon. Accelerating the hard fork is the last thing bitcoin needs right now, yet that is exactly what Jihan Wu proposes. His reasons for stating such things are rather odd, though. Moreover, this tweet indicates how desperate BU supporters are getting right now.

The hard fork in question is met with a lot of resistance from bitcoin enthusiasts. The decision to introduce such a hard fork is utterly reckless. Moreover, support for these plans is less strong than some people like to think. This does not bode well for BU's chances of success either. Accelerating one of the most controversial decisions in bitcoin's history is not the right course of action. At the same time, it highlights the fanaticism found within Bitcoin Unlimited right now.

There is a lot more at stake than just creating two separate bitcoin solutions, though. If a hard fork of this kind occurs, trust in cryptocurrency will be reduced to an all-time low. In fact, economists feel it may cripple bitcoin as a whole and push down the price to pre-2013 levels once again. It is evident the BU supporters do not care about this one bit, and Jihan Wu is certainly not concerned about the consequences right now.

It is evident the market does not want a hard fork, nor does the majority of the bitcoin community. Ever since BU started uttering threats to the bitcoin network, the BTC price has fallen sharply. Since BU is supported by a lot of early bitcoin investors, they couldn't care less about losing some portfolio value right now. It doesn't matter to them how these plans affect the rest of the community. If the hard fork tanks the price, people like Jihan Wu will not lose any sleep over it.

Last but not least, speeding up the BU hard fork can be seen as a replay attack against the network. If Unlimited wants a fair chance at succeeding, accelerating the fork is the last thing it needs to do. There seems to be a cult-like mentality among BU supporters which makes them say irrational things. Unfortunately, such rogue elements can impact the entire bitcoin ecosystem with relative ease. Some enthusiasts want BU to hard fork right now so this entire debacle becomes a non-issue, although it is doubtful things will play out that way.

Header image courtesy of Shutterstock


Bitcoin Price Weekly Analysis BTC/USD To Escalate Declines – newsBTC

Bitcoin price has started a downside move after failing near $1180 on many occasions against the US Dollar, and BTC/USD now could extend losses.


This past week we saw a decline in BTC price due to the U.S. Securities and Exchange Commission's verdict on the Bitcoin ETF request. The price moved down, but was later seen recovering above $1050. However, there were many attempts by the price to settle above $1200. It failed every time to settle above the $1180 and $1200 levels. As a result, a decline was initiated in Bitcoin price. It moved down below $1050, and settled below the stated level.

I think the most important close was below $1100. It is a major pivot, and a crucial support. So, a close below $1100 is a bearish sign, and might call for more losses in the near term. The price even traded below $1000 recently, and is currently recovering. On the upside, an initial resistance is near the 38.2% Fib retracement level of the last decline from the $1172 high to the $936 low.

There is also a bearish line on the 4-hours chart (data feed from SimpleFX) of BTC/USD with resistance near $1100. The most important hurdle is near the 50% Fib retracement level of the last decline from the $1172 high to $936 low. So, selling rallies may be considered in the near term.
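The retracement levels cited here follow directly from the swing high and low; a quick sketch of the arithmetic (levels rounded to cents):

```python
def fib_retracements(swing_high, swing_low):
    """Retracement levels for a decline: swing_low plus a Fibonacci
    fraction of the high-to-low range."""
    rng = swing_high - swing_low
    return {pct: round(swing_low + pct * rng, 2)
            for pct in (0.236, 0.382, 0.5, 0.618)}

levels = fib_retracements(1172, 936)
print(levels[0.382])  # 38.2% retracement of the $1172 -> $936 decline: 1026.15
print(levels[0.5])    # 50% retracement: 1054.0
```

For this swing, the 38.2% level the analysis cites as initial resistance works out to roughly $1,026, and the 50% level to roughly $1,054.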

Looking at the technical indicators:

4-hour MACD: The MACD is now well into the bearish slope.

4-hour RSI (Relative Strength Index): The RSI is well below the 50 level, suggesting bearish conditions.
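For reference, an RSI reading is computed from recent price changes; a simplified sketch using simple averages rather than Wilder smoothing, with hypothetical closing prices:

```python
def rsi(closes, period=14):
    """Simplified RSI: 100 - 100 / (1 + average gain / average loss)
    over the last `period` price changes (simple averages, not the
    Wilder-smoothed variant used by most charting packages)."""
    deltas = [b - a for a, b in zip(closes, closes[1:])]
    recent = deltas[-period:]
    avg_gain = sum(d for d in recent if d > 0) / period
    avg_loss = sum(-d for d in recent if d < 0) / period
    if avg_loss == 0:
        return 100.0
    return 100 - 100 / (1 + avg_gain / avg_loss)

# A reading below 50 means recent losses outweigh recent gains.
```

A steadily rising series pins the indicator at 100, a steadily falling one at 0; readings below 50, as in the analysis above, indicate selling pressure dominates the window.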

Major Support Level: $980

Major Resistance Level: $1075

Charts courtesy SimpleFX


Chinese Regulators Propose Rules for Bitcoin Exchanges: Report – CryptoCoinsNews

The People's Bank of China (PBOC) has proposed revisions to its anti-money laundering (AML) requirements for bitcoin exchanges, according to Zhou Xuedong, a National People's Congress delegate and director of the bank's department of business administration, as reported by Finance New Media.

Despite recent actions involving virtual currency exchanges, the central bank has not laid out any clear set of regulations for the bitcoin industry in the country.

Zhou has called for China to look at international regulatory efforts toward bitcoin in order to establish a regulatory mechanism for the industry.

Under the rules established in 2013, there are no know-your-customer (KYC) requirements as such. There was, however, a requirement, concerning virtual currency risk, for trading platforms to identify users' identities, including real name, registration name, identity card number and other information.

The proposed change covers two aspects of AML. One is to establish a structure to improve anti-money laundering and anti-terrorism financing controls along with an internal control system. The other is to clarify the platform's obligations, including prevention and control measures. Such measures would include customer identity information, a system for preserving transaction records and a system for recording suspicious transactions.

Virtual currency platforms will need to have on-site certification. A trading platform will have to follow the KYC principle and establish a system for customer identification. When users of virtual currency make withdrawals and redemptions, they will be required to present a valid identity document. Customers redeeming virtual goods or amounts equivalent to more than 50,000 yuan should provide remote video certification.

First-time customers would be required to complete in-person certification.

In addition to making detailed requests for customer identification, the proposal stipulates that platforms should appoint senior management personnel responsible for AML and anti-terrorism financing, and establish specialised agencies, positions and an internal control system.

The platform should properly store customer identity information and transactions to ensure complete and accurate reproduction of each transaction.

The proposal also includes a list of suspicious transactions to focus on. Should a transaction involve criminal activities like money laundering, the platform should report it to the central bank business management department.

Zhou said the whereabouts of funds can be traced through blockchain technology.

Also read: Report: China should clarify bottom line of bitcoin exchange supervision

Since the beginning of January 2017, the PBOC's Beijing and Shanghai branches, together with other branches and local financial regulatory authorities, have carried out on-site inspections of exchanges and found that their AML internal control systems are imperfect.

On January 11, the Shanghai and Beijing branches of the PBOC conducted on-site checks of three bitcoin exchanges, sending prices crashing. Exchanges in the country began making sweeping changes, starting with halting leveraged or loan-based trading of the cryptocurrency across platforms a week after the PBOC became involved. On January 24, Chinese exchanges ended zero-fee trading and began charging a flat 0.2 percent fee on the value of each transaction.
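That flat fee is simply a fixed fraction of the trade's value; for illustration (the trade sizes are hypothetical):

```python
def trading_fee(trade_value, rate=0.002):
    """Flat 0.2% fee charged on the value of each transaction."""
    return round(trade_value * rate, 2)

print(trading_fee(10000))  # a 10,000-yuan trade incurs a 20.0-yuan fee
```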

The central bank continued its investigation of Chinese bitcoin exchanges in February. A closed-door meeting with a number of domestic exchanges took place on February 8.

On February 9, a notice was issued requiring exchanges to upgrade their AML systems, with the standard development and implementation period expected to be one month.

On March 8, a notice mentioned the need to address AML, foreign exchange management, payment settlement and other financial regulations.

Featured image from Shutterstock.
