BoT expedites cloud computing adoption | Bangkok Post: tech – Bangkok Post

The adoption of public cloud computing in the banking and financial sectors will move at a faster pace after the Bank of Thailand's recent notification allowing the use of cloud in IT outsourcing services.

The notification, effective from Jan 31, indicates the central bank's awareness of the benefits of cloud-based services such as scalability and advanced functionalities for financial institutions.

"Thailand's central bank is one of five in Asia-Pacific -- the others being its peers in Singapore, Indonesia, Australia and South Korea -- that can adapt quickly to rapid changes in technological innovations and assist the local banking and financial sectors," said Andrew Cooke, regional director for legal affairs for Asia Pacific & Japan at Microsoft Operations Pte.

The adoption of cloud technology in the financial sector is still fairly limited because of security and data privacy concerns and regulations.

Using cloud technology in critical information systems requires the central bank's approval 30 days before deployment.

The central bank had not previously clarified the use of cloud in the financial sector.

"The notification can promote confidence and security among financial institutions in Thailand," said Mr Cooke.

To capitalise on this opportunity, he said Microsoft has issued guidelines for the use of cloud services by financial institutions in Thailand.

Lesly Goh, the financial service industry lead for Asia Pacific at Microsoft Operations, said Thailand's financial service market has a lot of potential as the country has large regional and local banks intending to adopt advanced digital technology to transform their services.

Microsoft may provide artificial intelligence techniques used in security monitoring systems for greater effectiveness, as well as help detect new ATM malware and monitor other e-payment activity to support the Thai government's national e-payment scheme.

Ms Goh said Microsoft is also encouraging banks in Thailand to adopt blockchain technology in common services like digital identity, supply chain and trade finance.

Keshav Dhakad, regional director for the digital crime unit at Microsoft Operations, said the financial sector is one of the top traditional targets for cybercrime. "Thailand ranked among the top five countries in Asia Pacific in 2016 with the highest number of malware infections," he said.

Trojans, worms and viruses are the three most common malware categories in Thailand.

Nearly 25% of computers in Thailand are infected with a Trojan -- higher than the global average of 10%.

In addition, Mr Dhakad said the average amount of time organisations in Asia Pacific needed to detect cyber threats hidden in their systems is longer than 500 days, compared with 140 days globally.

Mr Dhakad attributed the high infection rate in the country to the lack of cybersecurity risk assessment measures and the use of pirated software or counterfeit software packages.

Read the rest here:
BoT expedites cloud computing adoption | Bangkok Post: tech - Bangkok Post

Read More..

The spotlight falls on data centre resiliency as data keeps growing – Cloud Tech

It's hard to imagine the sheer scale of data created on a daily basis. In just one second, 747 Instagram photos are uploaded, 7,380 tweets are sent and 38,864GB of traffic is processed across the internet. All told, an estimated 2.5 quintillion bytes of data are created every single day. With IoT connections booming, this number is expected to keep growing exponentially. Almost all of this data travels through a data centre at some point.
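
To put those per-second figures in context, here is a quick back-of-the-envelope conversion into daily totals; the short Python sketch below simply multiplies the numbers quoted above by the number of seconds in a day.

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Per-second and per-day figures quoted above.
instagram_photos_per_sec = 747
tweets_per_sec = 7_380
daily_data_created_bytes = 2.5e18  # "2.5 quintillion bytes"

print(f"Instagram photos per day: {instagram_photos_per_sec * SECONDS_PER_DAY:,}")
print(f"Tweets per day:           {tweets_per_sec * SECONDS_PER_DAY:,}")
print(f"Data created per day:     {daily_data_created_bytes / 1e18:.1f} exabytes "
      f"(~{daily_data_created_bytes / 1e9:,.0f} GB)")
```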

Businesses are handing the hosting and management of their infrastructure and systems software to third-party data centre operators, a move that has enabled companies of all sizes to become more agile and cost-conscious.

This phenomenon is projected to grow, with 86% of workloads expected to be processed by cloud data centres by 2019, and only 14% by traditional data centres. Perhaps even more striking is that the same forecast indicates 83% of data centre traffic will be cloud traffic in the next three years.

This explosion in data, cloud applications, services and infrastructure has brought about a change in data centre usage which in turn has also demanded a change in physical facilities.

It is essential that four features are woven into the design and functionality of every data centre: scalability, availability, resiliency and security. Outsourced data centre operators must be able to handle a surge in demand; without adequate capacity and environmental monitoring, servers can quickly become overworked and cause outages.

In addition, data centres have to demonstrate resiliency in order to reassure their customers. Corporate enterprises, particularly those who have migrated to hybrid environments, live in fear of an outage and the resulting impact on costs and reputation. And with good reason.

In autumn 2015 a data centre owned by Fujitsu suffered a power outage which took down a number of cloud services. This was not a short-lived problem; the effects persisted for some time and affected customers on the Fujitsu public cloud and its private hosted cloud as well as other infrastructure services.

As with Fujitsu, data centre and service availability can be disrupted in many ways. Power supply failure is one of the biggest causes, as are cyber-attacks, but data centres can also be affected by overheating if efficient cooling is not in place, or even by extreme weather incidents. Examples vary from the mundane to the unbelievably absurd.

Despite the risks of failure, few of these scenarios actually have to result in downtime, provided there is a good understanding of the data centre environment, a suitable level of real-time operational intelligence, and procedures in place to identify issues before they lead to disaster or failure.

Sophisticated solutions are available to provide real-time insight, control and predictability that help data centre managers to deal with environmental and operational challenges. Environmental conditions can be monitored constantly for any potential issues, and assets tracked and managed to maintain their performance and guard against technical breakdown.
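
As a rough illustration of the kind of rule such monitoring tools apply, the sketch below checks simulated sensor readings against environmental limits; the sensor names and threshold values are illustrative assumptions rather than settings from any particular product.

```python
from typing import List, NamedTuple

class Reading(NamedTuple):
    sensor_id: str
    temperature_c: float
    humidity_pct: float

# Illustrative limits only; real facilities tune these to their own design envelope.
TEMP_MAX_C = 27.0
HUMIDITY_MAX_PCT = 60.0

def check(reading: Reading) -> List[str]:
    """Return an alert message for each environmental limit the reading exceeds."""
    alerts = []
    if reading.temperature_c > TEMP_MAX_C:
        alerts.append(f"{reading.sensor_id}: inlet temperature {reading.temperature_c} C "
                      f"exceeds {TEMP_MAX_C} C")
    if reading.humidity_pct > HUMIDITY_MAX_PCT:
        alerts.append(f"{reading.sensor_id}: relative humidity {reading.humidity_pct}% "
                      f"exceeds {HUMIDITY_MAX_PCT}%")
    return alerts

for r in [Reading("rack-12-inlet", 29.5, 41.0), Reading("rack-07-inlet", 23.1, 48.0)]:
    print(r.sensor_id, check(r) or "OK")
```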

As data continues to grow and cloud traffic increases, utilising intuitive insight and fit-for-purpose tools such as those described above will help data centres and their operators to maintain resilience, ensure uptime and support their customers as they move away from internally managed IT estates.

More here:
The spotlight falls on data centre resiliency as data keeps growing - Cloud Tech

Read More..

Vapor IO launches edge micro data centre for telcos – Global Telecoms Business

22 February 2017

Vapor IO has launched Vapor Edge for Telecom, a complete 5G-ready micro data centre for telco edge computing

Vapor IO, the hybrid and edge cloud specialist, has launched Vapor Edge for Telecom, a complete platform for edge computing. The solution is a self-contained micro data centre in a wireless base station, designed as 5G-ready.

Cole Crawford, CEO and founder of Vapor IO, talking to Capacity, said: "The telcos cannot defeat the laws of physics - they can only get so much speed back to the data centre - so we are bringing the data centre right to the edge of the network, where they need it. You and I have talked extensively before about the fact that mobile operators and landowners are in an ideal position to capitalise on the emerging need for low-latency edge computing. They own the key infrastructure, including tens of thousands of remote tower and base station locations with power and high-bandwidth backhaul."

"These locations are ideal forVaporEdge technology, and Vapor Edge can help carriers upgrade their infrastructure and business models as they move to a fullyvirtualisedinfrastructure with edge computing and 5Gcapabilities Crawford said.

Vapor Edge is a complete platform for edge computing that provides wireless telecom companies with a simple way to deploy and manage cloud servers that are co-located with their base station equipment.

This makes it easy for carriers and wireless base-station landowners to offer cloud compute capabilities in close proximity to the Radio Access Network (RAN), enabling new low-latency applications and creating new business models for these players as they form partnerships with public cloud providers, web-scale companies and other over-the-top (OTT) providers to deliver edge capabilities.

Vapor Edge for Telecom can also help carriers upgrade their networks by providing a way to house and manage their standard rack-mountable equipment, including rack-mountable cloud servers as well as C-RAN, NFV, Mobile Edge Computing (MEC) and other specialised telecom equipment. Vapor Edge for Telecom offers the following to wireless operators:

Vapor Edge for Telecom supports today's 2G, 3G and 4G technologies, and is ready for the coming 5G technologies that will provide extremely high-bandwidth, highly reliable and very low-latency wireless services.

Vapor Edge for Telecom supports the concept of edge computing with or without MEC. Users can add server equipment to Vapor Edge without requiring any changes to base station equipment, and can continue to do so as base station equipment receives MEC upgrades. Vapor Edge for Telecom interfaces with MEC APIs to coordinate work as well as expose real-time status of the RF network to cloud applications. Equipment providers can also use Vapor Edge for Telecom to extend their MEC capabilities by housing MEC servers in the Vapor Chamber, which allows for much more expansion capability than exists in standard base station equipment.

As operators virtualise their networks and move their networking logic onto commodity off-the-shelf servers, those servers can be incorporated into Vapor Edge for Telecom. NFV servers can be housed in the Vapor Chamber located in a base station and can be instrumented with OpenDCRE for remote management with Vapor Edge software.
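
As a rough sketch of what that kind of remote instrumentation might look like, the snippet below polls a temperature sensor over an OpenDCRE-style REST interface; the base URL, API version and device path shown here are illustrative assumptions, not documented OpenDCRE endpoints.

```python
import requests

# Assumed, illustrative endpoint layout; consult the OpenDCRE docs for the real API.
BASE_URL = "http://vapor-chamber.example.local:5000/opendcre/1.3"

def read_temperature(rack_id: str, board_id: str, device_id: str) -> dict:
    """Fetch one temperature reading from a hypothetical OpenDCRE-style endpoint."""
    url = f"{BASE_URL}/read/temperature/{rack_id}/{board_id}/{device_id}"
    response = requests.get(url, timeout=5)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(read_temperature("rack_1", "00000001", "0002"))
```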

Vapor Edge helps operators move their Radio Access Network logic onto off-the-shelf servers. Vapor Edge also allows the operator to incorporate their own private cloud for operating C-RAN at the base station.

Vapor Edge allows public and private cloud providers, as well as OTT providers (such as media companies), to have a presence in the base station and run workloads on servers in the Vapor Edge environment, which can be used to offload work from centralised data centres.

Continue reading here:
Vapor IO launches edge micro data centre for telcos - Global Telecoms Business

Read More..

Now trust in public cloud services outnumbers distrust by more than 2-to-1: Intel Security – The Tech Portal

The report adds that most organizations are of the opinion that public cloud services are as secure as or more secure than private cloud servers, and that they help lower the total cost of ownership and improve overall data visibility. And it might be the first time but people ...
Related coverage: "Cloud adoption is increasing, but so are the risks, Intel states" (PCMag India); "Security Skills Shortage Hits Cloud Deployment: Study" (CXOToday.com).

Originally posted here:
Now trust in public cloud services outnumbers distrust by more than 2-to-1: Intel Security - The Tech Portal

Read More..

Lenovo to build and run SAP’s cloud in China – The Register

What China wants, China gets: in this case, an exception to SAP's usual practice of running its own cloud.

Behind the great firewall, that job's just gone to Lenovo, which will deliver a new enterprise cloud solution created exclusively for customers holding licenses for the SAP HANA platform in China. Lenovo will also run and manage the rig for SAP's Chinese HANA customers.

The Lenovo Enterprise Cloud designed for SAP HANA will be a RAM-crammed cloud using the Chinese company's System x3850 and x3950 X6 servers, plus the System x3650 X6 for application servers. So a nice showcase for the company's kit, and a big customer to boot, which will be a fillip after recent revenue disappointments in Lenovo's data centre group.

Perhaps more importantly, it will be a mighty symbol of Lenovo's smarts and ability to do big, complex, things for top software companies. SAP has partners elsewhere such as Deloitte and AWS - but The Register is unaware of a whole-of-country deal to rival this Lenovo tie-up.

The too-nauseatingly-confected-to-use canned quotes provided by both parties certainly leave no adjective unturned in pursuit of praising Lenovo's ability to take on this task.

But the back story here is that China doesn't let western tech concerns do business behind the great firewall without making a substantial offering to nourish local players and develop local expertise. Hence the raft of joint ventures and other deals big companies do in China, all with a national security source code review before the fun starts.

But SAP's not having its arm twisted here, because Lenovo can't afford to fail in this role. Who, after all, would want anything to do with the organisation that crocked SAP?

Read the original here:
Lenovo to build and run SAP's cloud in China - The Register

Read More..

Google Rolls Out GPU Cloud Service – TOP500 News

The largest Internet company on the planet has made GPU computing available in its public cloud. Google announced this week that it has added the NVIDIA Tesla K80 to its cloud offering, with more graphics processor options on the way. The search giant follows Amazon, Microsoft and others into the GPU rental business.

According to a blog posted Tuesday, a user can attach up to four K80 boards, each of which houses two Kepler-generation GK210 GPUs and a total of 24GB of GDDR5 memory. The K80 delivers 2.9 teraflops of double precision performance or 8.73 teraflops of single precision performance, the latter of which is the more relevant metric for deep learning applications. Since we're talking about a utility computing environment here, a user may choose to rent just a single GPU (half a K80 board) for their application.

The initial service is mainly aimed at AI customers, but other HPC users should take note as well. Although Google has singled out deep learning as a key application category, the company is also targeting other high performance computing applications, including computational chemistry, seismic analysis, fluid dynamics, molecular modeling, genomics, computational finance, physics simulations, high performance data analysis, video rendering, and visualization.

Google's interest in positioning its GPU offering toward deep learning is partially the result of the in-house expertise and software the company has built in this area over the last several years. The new cloud-based GPU instances have been integrated with Google's Cloud Machine Learning (Cloud ML), a set of tools for building and managing deep learning codes. Cloud ML uses the TensorFlow deep learning framework, another Google invention that is now maintained as an open source project. Cloud ML helps users employ multiple GPUs in a distributed manner so that applications can be scaled up, the idea being to speed execution.
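
As a minimal illustration of the kind of GPU-backed work involved, the sketch below uses the TensorFlow 1.x API that was current at the time to pin a matrix multiplication to a single GPU; it is a generic TensorFlow example under those assumptions, not Cloud ML-specific code.

```python
import tensorflow as tf  # TensorFlow 1.x API, current at the time of writing

# Pin the computation to the first GPU attached to the instance.
with tf.device("/gpu:0"):
    a = tf.random_normal([2048, 2048])
    b = tf.random_normal([2048, 2048])
    c = tf.matmul(a, b)

# log_device_placement prints which device each op actually ran on;
# allow_soft_placement falls back to the CPU if no GPU is available.
config = tf.ConfigProto(log_device_placement=True, allow_soft_placement=True)
with tf.Session(config=config) as sess:
    result = sess.run(c)
    print(result.shape)
```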

The Tesla K80 instance is initially available as a public beta release in the Eastern US, Eastern Asia and Western Europe. Initial pricing is $0.70 per GPU/hour in the US, and $0.77 elsewhere. However, that doesn't include any host processors or memory. Depending on what you want, that can add as little as $0.05 per hour (for one core and 3.75 GB of memory), all the way up to more than $2 per hour (for 32 cores and 208 GB of memory). For a more reasonable configuration, say four host cores and 15 GB of memory, an additional $0.20 per hour would be charged.

That would make it roughly equivalent to the GPU instance pricing on Amazon EC2 and Microsoft Azure, which include a handful of CPU cores and memory by default. Both of those companies, which announced GPU instances for their respective clouds in Q4 2016, have set their pricing at $0.90 per GPU/hour. For users willing to make a three-year commitment, Amazon will cut the cost to $0.425 per GPU/hour via its reserved instance pricing.

IBM's SoftLayer cloud also has a number of GPU options, but it rents out complete servers rather than individual graphics processors. A server with a dual-GPU Tesla K80, two eight-core Intel Xeon CPUs, 128 GB of RAM, and a couple of 800GB SSDs will cost $5.30/hour. Other K80 server configurations are available for longer terms, starting at $1,359/month.

At this point, HPC cloud specialist Nimbix has what is probably the best pricing for renting GPU cycles. They're offering a K80-equipped server - that is, two GPUs - with four host cores and 32 GB of main memory for $1.06/hour. That's substantially less expensive than any of the other cloud providers mentioned, assuming your application can utilize more than a single GPU. Nimbix is also the only cloud provider that currently offers a Tesla P100 server configuration, although that will cost you $4.95 per hour.
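
Pulling the quoted figures together, the short sketch below compares approximate hourly and per-GPU costs across the providers mentioned; the prices are simply the ones reported above, and the Google row assumes the four-core, 15 GB host configuration.

```python
# (provider and configuration, GPUs included, quoted hourly price in USD)
offers = [
    ("Google (1x K80 GPU + 4 cores, 15 GB)",                1, 0.70 + 0.20),
    ("AWS / Azure on-demand (1x GPU, bundled CPU/RAM)",     1, 0.90),
    ("AWS 3-year reserved (1x GPU)",                        1, 0.425),
    ("Nimbix K80 server (2 GPUs, 4 cores, 32 GB)",          2, 1.06),
    ("IBM SoftLayer K80 server (2 GPUs, 2x Xeon, 128 GB)",  2, 5.30),
]

# Sort by effective price per GPU to make the comparison apples-to-apples.
for name, gpus, hourly in sorted(offers, key=lambda o: o[2] / o[1]):
    print(f"{name:55s} ${hourly:5.3f}/hr  (${hourly / gpus:.3f} per GPU/hour)")
```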

Even though the initial GPU offering from Google is confined to the Tesla K80 board, the company is promising that NVIDIA Tesla P100 and AMD FirePro configurations are coming soon. The specific AMD device is likely to be the FirePro S9300 x2, a dual-GPU board that offers up to 13.9 teraflops of single precision performance. When Google previewed its accelerator rollout last November, it implied the FirePro S9300 x2 would be aimed at cloud customers interested in GPU-based remote workstations. The P100 is NVIDIA's flagship Tesla GPU, delivering 5.3 or 10.6 teraflops of double or single precision performance, respectively.

At this point, Google is in third place in the fast-growing public cloud space, trailing Amazon and Microsoft, in that order. Adding a GPU option is not likely to change that, but it does illustrate that graphics processor-based acceleration is continuing to spread across the IT datacenter landscape. Whereas GPU acceleration was once confined to HPC, with the advent of hyperscale-based machine learning it quickly became standard equipment for hyperscale web companies involved in training neural networks. Now that more enterprise customers are looking to mine their own data for machine learning purposes, the GPU is getting additional attention. And for traditional HPC, many of the more popular software packages have already been ported to GPUs.

This all might be good news for Google, but it's even better news for NVIDIA, and to a lesser extent AMD, which still stands to benefit from the GPU computing boom despite the company's less cohesive strategy. NVIDIA just announced record revenue of $6.9 billion for fiscal 2017, driven in part by the Tesla datacenter business. That can only get better as GPU availability in the cloud becomes more widespread.

Go here to see the original:
Google Rolls Out GPU Cloud Service - TOP500 News

Read More..

Banks Need to Be Centralized Could Blockchain be the Answer? – Finance Magnates (blog)

The queues in banks have shortened, the digital transformation is progressing, a few people argue that banks will become redundant, and a few believe banks should rise and serve as the operating system. However, I vote for banks to be centralized! And to some degree, Blockchain could help banks achieve that.

Unlike the olden days when data and information were maintained in ledgers and files, data can now be digital and stored on cloud servers. So, if the account information was entered in ABC city, why can't it be accessed from XYZ city?

Thanks to globalization, the customer is not in one place now. He might have been born in one town and then moved to x number of cities for reasons ranging from studies to a job to marriage! And he cannot raise a transfer request each time.

Yes, the services are available online and could be managed there. But in a critical situation like demonetization in India, where people were asked to visit only their respective home branch, it gets tough!

Or what about the situation when people need to change their mailing address, and this can only be done from their home branch?

Centralizing banks could help us in a number of ways.

Blockchain could assist banks in achieving centralization through its permissioned network technology.

Experts believe that Blockchain has disruptive powers that could allow any process, from simple documentation to a complex cross-country settlement, to be automated with just a few clicks.

Adoption of Blockchain by banks would not only help people who already use bank services but also individuals who are unbanked and still figuring out what a bank is. Just imagine a person in rural India getting his biometrics done and submitting his eKYC, opening his new bank account and taking a loan for his farming! All this could be achieved within a couple of minutes, and it is a possibility with Blockchain.
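
As a toy illustration of the tamper-evident record-keeping a permissioned ledger could provide in that scenario, the sketch below chains eKYC and account events together with hashes; the customer, account and branch identifiers are made up, and this is a simplified stand-in rather than an actual permissioned blockchain.

```python
import hashlib
import json
import time

def entry_hash(payload: dict, prev_hash: str) -> str:
    """Hash an event together with the previous entry's hash to form a chain."""
    body = json.dumps(payload, sort_keys=True) + prev_hash
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

# Hypothetical events for an illustrative rural customer.
events = [
    {"event": "ekyc_submitted",  "customer": "C-1001", "branch": "rural-branch-07"},
    {"event": "account_opened",  "customer": "C-1001", "account": "SB-445566"},
    {"event": "loan_sanctioned", "customer": "C-1001", "amount_inr": 50000},
]

ledger, prev = [], "0" * 64
for event in events:
    record = {"timestamp": time.time(), **event,
              "prev_hash": prev, "hash": entry_hash(event, prev)}
    ledger.append(record)
    prev = record["hash"]

# Any participant on the permissioned network can recompute the hashes
# to verify that no earlier entry has been altered.
for record in ledger:
    print(record["event"], record["hash"][:16])
```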

This article was written by Samiksha Seth, a research consultant.

Go here to see the original:
Banks Need to Be Centralized Could Blockchain be the Answer? - Finance Magnates (blog)

Read More..

VPSServer.com Expects Rapid Growth in 2017 – Satellite PR News (press release)

New York, NY (SBWIRE) - 02/21/2017 - VPSServer.com, a cloud-computing company launched in 2016 by Global Cloud Infrastructure, announces that it is expecting significant growth throughout 2017 on the back of the massive growth analysts are expecting for the industry as a whole over the coming years. Founded in 2015 and based in the United States and Europe, Global Cloud Infrastructure launched the VPSServer.com product in 2016, drawing on the more than 15 years of experience the organization's team has accumulated in the field of cloud computing. Server hardware today is becoming so powerful that it has become both feasible and economical to rent out just a small portion of a server's capacity, rather than the whole physical server itself.

A virtual private server can start at just five dollars per month and can be scaled to accurately meet the specific needs of the customer. Robert Bolder, CEO of VPSServer, said: "Three-way storage replication is enabled on our cloud hosting platform, which means that our virtual private servers are stored on three different servers at the same time. This is much safer than the approach taken by most of our competitors, who work with a single RAID system stored on just one hosting server." The first datacenter that VPSServer used to deploy cloud servers was located in Miami, Florida (USA). As a sort of hub to the Americas and beyond, Miami is ideally located to serve the growing demands of multiple markets. In order to minimize latency (i.e. the amount of time it takes for a packet of data to travel from the user's computer to the datacenter and back again), customers have a preference for cloud servers that are closer to their actual location, as latency is directly related to how close the remote server is.
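
As a rough way to see that relationship, the sketch below times a TCP handshake as a crude proxy for round-trip latency to two datacenters; the hostnames are placeholders rather than real VPSServer endpoints.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Time a TCP handshake as a rough proxy for round-trip latency to a server."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

# Placeholder hostnames: substitute the datacenters you actually want to compare.
for host in ("datacenter-miami.example.com", "datacenter-amsterdam.example.com"):
    try:
        print(f"{host}: {tcp_rtt_ms(host):.1f} ms")
    except OSError as error:
        print(f"{host}: unreachable ({error})")
```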

In response to this need, VPSServer expanded its platform in January 2017 to seven more datacenters based in both the United States and Europe. Erik Jan Visscher, CTO of VPSServer, said: "Recently, we conducted some benchmark testing and saw amazing results. To test the server performance, we are using an independent benchmark tool called Serverscope.io. With our high-end processors and fully equipped SSD storage, our VPSs are outperforming all of our competition." Recently, Global Cloud Infrastructure closed a Series A round of funding with a veteran investor. With this second round of funding completed, the investment will ensure that Global Cloud Infrastructure can continue to grow at its current pace while maintaining the same standards of product quality.

About Global Cloud Infrastructure: Global Cloud Infrastructure (VPSserver.com), a cloud-server company founded in 2015 with a presence in both the United States and Europe, specializes in the provision of virtual private servers. VPSserver.com has more than 15 years of experience in the cloud computing industry, and with the organization's worldwide datacenters, high-end processors, and fully equipped SSD storage, VPSserver.com is outperforming all of its competitors.

CONTACT: Jeroen Nijholt Address: 5 Penn Plaza, 19th Floor PMB #19055, New York, NY 10001 Phone: +31614789523 Email: jeroen.nijholt@vpsserver.com Website: https://www.vpsserver.com/

Read the original post:
VPSServer.com Expects Rapid Growth in 2017 - Satellite PR News (press release)

Read More..

Altcoin Uptick Ends as Bitcoin Prices Pass $1050 – CoinDesk

While so-called altcoins enjoyed a notable rally early this week, this run appears to be over.

The market for alternative cryptocurrencies, a collection of digital tokens with different blockchain networks and value propositions from the bitcoin protocol, received significant tailwinds earlier this week as bitcoin traders sat on the sidelines waiting to hear the latest news from China.

As bitcoin trading volumes fell at many exchanges, many of the top cryptocurrencies listed on CoinMarketCap (ranked by market capitalization) enjoyed notable price gains, forming what could be referred to as a smaller altcoin rally.

Today, however, we saw a sharp reversal of this trend.

The ether markets provided a perfect example.

Ether, the digital currency that powers the smart contract-based blockchain platform ethereum, surged nearly 20% to $13.33 on CoinMarketCap on 14th February, reaching its highest value so far this year.

Analysts pointed to robust trading volume, a highly leveraged market and signs of rising adoption as helping fuel this price rise.

However, the digital currency has fallen back, dropping to $12.64 today. At the time of report, ether was trading at $12.71.

Monero, a privacy-oriented digital currency that leverages ring signatures to conceal user identities, also provided strong evidence that the rally enjoyed by altcoins earlier this week is now over.

The privacy-oriented digital currency rose roughly 12% from $12.20 on 14th February to $13.74 on 15th February, its highest price in more than a month, CoinMarketCap figures reveal.

However, monero lost much of its recent gains when it dropped to $13.26 later in the 15th February session, and it has failed to mount any notable recovery since that time.

Instead, the digital currency has moved largely within a reasonably tight range between $13.30 and $13.60.

Elsewhere, dash, a digital currency that offers users privacy and instant transactions, rose 8.6% to $19.66 on 15th February, but fell back to $18.75 later that session, CoinMarketCap figures show.

As for what caused the altcoin rally to end, one simple explanation is that cryptocurrency traders have once again become interested in bitcoin.

Far and away the market's largest asset, bitcoin has seen its price climb more than 5% since falling to $1,001 early in the 15th February session, CoinDesk USD Bitcoin Price Index (BPI) data shows.

At the time of report, bitcoin was trading at $1,056.92, 2.2% higher for the day, BPI figures reveal.

This recent uptick in price has coincided with a notable increase in volume, as market participants traded 16,000 bitcoins worth of transactions through Bitfinex in the 24 hours through 21:30 UTC.

This compares to an average of 9,700 bitcoins per session over the last seven days, Bitcoinity data shows.
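
For the moves where both the starting and ending figures are quoted above, the percentages are easy to sanity-check; the short sketch below recomputes them from the reported prices and volumes.

```python
def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100.0

# Only pairs where both endpoints are quoted in this article.
print(f"Monero, 14th-15th February:       {pct_change(12.20, 13.74):+.1f}%  (reported as roughly 12%)")
print(f"Bitcoin, from the $1,001 low:     {pct_change(1001.00, 1056.92):+.1f}%  (reported as more than 5%)")
print(f"Bitfinex volume vs 7-day average: {pct_change(9_700, 16_000):+.0f}%")
```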

Tim Enneking, chairman of the digital asset hedge fund Crypto Asset Management, explained it this way:

"Since bitcoin jumped up again, the alts have cooled off."

Go here to see the original:
Altcoin Uptick Ends as Bitcoin Prices Pass $1050 - CoinDesk

Read More..

This Is The World’s First Cryptocurrency Issued By A Hedge Fund – Forbes

While launching a cryptocurrency brings legal, technical and business risks, it could also create the first hedge fund with a network effect.
Related coverage: "Hedge Fund Numerai Launches its Own Cryptocurrency" (Finance Magnates).

Read the original post:
This Is The World's First Cryptocurrency Issued By A Hedge Fund - Forbes

Read More..