
Anand, the author, is just loving it – The Hindu

Viswanathan Anand, the author, is just loving it. Selective city-hopping to promote his book, Mind Master, and getting first-hand feedback on his honest effort.

So you begin by asking the Grandmaster, how does it feel to be an author?

I would say it's very nice. But you have to do this with some enthusiasm, says Anand, and elaborates: the point of the book is to share, to add to the public idea of who or what you are.

I mean, most people have a certain impression of what a chess player is like, what you do and so on. And they now have something which is slightly closer to the truth.

That's why I felt that it was important to address this book as a chance to get your story out and, you know, you have to divulge stuff. So even though that's not something I normally do liberally, over the last few years, I found that I have opened up in many ways.

I talk more frankly. Maybe it's just the sense that now everything's in the past and I can talk about it without the pains and so on. But it's been fun for me. I had this feeling of something, you know, I've done it. Now I can't do anything more.

Did any of the chess players get back to you on this?

Quite a few players said they really liked it. Many players said they couldn't get it yet. I probably expect more feedback later. (Boris) Gelfand said he loved it. He was in Chennai and he read it on the flight back home.

Kramnik didn't mention it. Anish (Giri) even tweeted something, you know, as usual. I'm sure if I probed he would tell me something. But I expect a few others will read it as well.

Away from the book, what does your schedule for 2020 look like?

For the moment, I've got a tournament, the Grenke Chess Open, in April. Hopefully I'll get an event in May. And then it's a fairly light year because I am not playing the Grand Chess Tour. I'll play the Olympiad, maybe. And if some new tournament turns up, I might go for that.

After all these years of complaining that the schedule was heavy, if I spend this year complaining, it doesn't make sense. So I thought I'll just go with the flow, keep working, keep training and also enjoy more time at home.



Matic nodes now available for deployment on the Ankr cloud – CryptoNinjas

Ankr, the distributed cloud sharing platform, today announced the launch of its one-click deployment solution for hosting Matic testnet nodes on the Ankr cloud.

Node hosting as a service has become a major use case on Ankr, with over 25 blockchain protocols now added to its node hosting application platform.

"We are thrilled to be working with our long-time partners Ankr on this use case. Two of our highest priorities with our staking mechanism are providing seamless Validator onboarding and ensuring the highest possible level of decentralization; Ankr's one-click deployment makes it possible for non-techies to participate in our staking event as a Validator, and their distributed infrastructure will ensure the utmost decentralization of our PoS Validator network," said Sandeep Nailwal, Co-founder and COO of Matic Network.

Recently, Ankr has also delivered node hosting applications for Binance Chain and Vite and is currently integrating and preparing collaborations with many more.

"We are excited to launch our node hosting solution with our friends at Matic Network, and are very proud of everything they have achieved since we first met in Mumbai. We are happy to help the Matic community to easily host their nodes on Ankr, and further decentralize the network during this stage and in the future," said Ryan Fang, Co-founder and COO of Ankr.

Ankr also announced today a strategic partnership with BiKi Mining Pool, the staking service provider of BiKi Exchange. Ankr will conduct in-depth collaboration with BiKi Mining Pool to integrate high-profile blockchain nodes and provide the infrastructure and resources to further expand and jointly develop the BiKi staking ecosystem.


The National Archives is looking for some more cloud – FedScoop

Written by Tajha Chappellet-Lanier | Feb 4, 2020 | FEDSCOOP

The National Archives and Records Administration (NARA) needs more flexibility and efficiency from its cloud services.

The independent agency posted a request for information recently, soliciting feedback from companies on a potential plan to replace its enterprise cloud contract. The contract listing is called Platform & Infrastructure for Cloud Archives & Records Depositories or, in acronym form, PICARD.

The requirements of a new cloud contract spring from the directive that NARA transition to fully electronic record keeping, and stop collecting paper records, by the end of 2022.

"Beginning January 1, 2023, all other legal transfers of permanent records must be in electronic format, to the fullest extent possible, regardless of whether the records were originally created in electronic formats," an Office of Management and Budget memo from July states. After that date, agencies will be required to digitize permanent records in analog formats before transfer to NARA.

NARA currently uses Amazon Web Services, but is considering moving to multi-cloud. The agency recognizes that other agencies transferring records to NARA could be transferring them from any cloud hosting environment, and NARA needs to be prepared to ensure the efficient transfer of records into its legal and logical custody.

The agency also has new functionalities in mind, and as such requires the flexibility to expand its cloud presence.

Responses to the RFI are due Feb. 20.


UKCloud survey reveals why public sector organisations are turning to multi-cloud – Intelligent CIO Africa

The survey gathered responses from more than 300 senior public sector IT professionals and business leaders, revealing the challenges around cloud adoption.

UKCloud, the multi-cloud experts dedicated to making transformation happen across the UK public sector, has announced the results of a survey of more than 300 senior IT professionals and business leaders that reveals the key challenges and issues that are affecting cloud adoption.

The UK government adopted a cloud first policy in 2013 which signalled a significant shift from traditional IT solutions to more agile, scalable and cost-effective cloud solutions that enable the broader Digital Transformation agenda. But while there has been some successful use of cloud, the majority of public sector IT has not yet made the shift. Last year, Crown Commercial Service (CCS) and the Government Digital Service (GDS) commenced a review of the cloud first policy, while organisations such as NHSX and Defence Digital were formed to address specific challenges of harnessing innovative technologies to drive better public services.

"We have been monitoring the adoption of cloud-based services across the UK for the past 10 years. We have seen unprecedented change take place in that time, with many companies now realising the potential of cloud services helping them fulfil their Digital Transformation goals. These journeys may have started a long time ago, but they are far from over: rapidly evolving business challenges mean that diversity and collaboration are necessary to move forward. A cloud-led strategy must be at the heart of any Digital Transformation," said Alex Hilton, Chief Executive at the Cloud Industry Forum.

Finding #1: Organisations are cloud keen and want to focus on outcomes

The results of the survey confirm an almost universal desire to shift from traditional IT environments to cloud solutions: 87.2% of those surveyed stated that they would do this if a perfect solution existed. These results were reflected at all levels, across business and technical respondents, and 82% of respondents agreed that the senior leadership in their organisation understands and values progressive technology. Yet the survey also found that more than three-quarters cited a lack of clear policy/strategy as a factor impeding cloud adoption, which supports the policy review that CCS and GDS have undertaken. The survey also revealed that more focus is needed on the technical and non-technical challenges of modernising existing technology, not just on building new cloud-native applications.

Finding #2: Significant concern about commercial risks of single-provider solutions

There is a lot of focus on the technical challenges of adopting cloud, but one of the key revelations from the survey was the extent of concern about the commercial risks of cloud adoption. A total of 78% of respondents expressed a fear of vendor lock-in, and a similar number agreed that the risk of over-reliance on a sole provider is inhibiting their cloud adoption. More than 85% agreed that they would prefer multi-cloud, presumably as a means to mitigate these commercial risks. These concerns aren't exclusive to the UK: last year, Angela Merkel spoke of Gaia-X as a way to help the EU avoid becoming over-reliant on US-based cloud providers.

Finding #3: Operational and security risks constrain wholesale adoption of public cloud

Another aspect of risk that came through in the survey results relates to security and operational risks to live systems. A total of 85.2% of those surveyed believe that their organisation is reluctant to move workloads to the cloud due to risk and security concerns. As reported by the National Cyber Security Centre, a significant number of cyberattacks come from hostile nation states, and the survey results imply that many organisations are concerned that traditional applications are inherently not ready for these emerging threats.

Hence, there remains a significant minority that still will not consider public cloud for their most secure and sensitive systems, and 40% also ruled out public cloud for systems that they need to run on-premises or in Crown Hosting. This could imply that organisations feel constrained by the hyperscale model of public cloud, whereas a multi-cloud strategy would enable them to consider a variety of cloud offerings (including private cloud and secure cloud) that have been specifically designed for these more sensitive and critical environments.

Finding #4: Disproportionate focus on cloud-native skills and capabilities

The fourth finding is generally well publicised. A total of 78.3% of respondents confirmed that they lacked the skills and resources, such as DevOps and automation, to build and operate cloud-native applications in what Gartner refers to as bimodal capability.

While such skills are necessary to get the best out of hyperscale platforms, multi-cloud enables organisations to carry forward their existing skills in established technologies like VMware, Red Hat and Cisco, which remain relevant for longer. Multi-cloud also enables organisations to consider buying specialist SaaS solutions rather than building their own cloud-native applications, tapping into the skills and capabilities of innovative software companies.

Finding #5: Clear need for more commercial control and flexibility

Last but by no means least are the findings around the affordability and budgeting challenges that some public sector organisations are experiencing. A total of 84.5% of respondents agree that cost/affordability is the biggest impediment to cloud adoption, with almost 80% agreeing that fear of runaway costs is a notable hindrance. This supports the concept of cloud repatriation, where organisations bring unsuited workloads back from the public cloud, and explains demand for tools like VMware CloudHealth, which help organisations better understand the costs they are incurring in the public cloud.

Moreover, 82.57% of those surveyed also cited the misalignment of CAPEX and OPEX budgets as an impediment to cloud adoption. Public cloud services by their nature are suited only to OPEX budgets. Multi-cloud expands these options so that customers can mix dedicated environments with shared environments, making the best use of both their CAPEX and OPEX budgets.


Build your own cloud infrastructure with Nextcloud and Collabora – IT PRO

Cloud services have revolutionised the way we work, providing easy paths for collaboration over a distance and business-critical features such as automatic off-site backups and version control.

However, potential issues range from not wanting to entrust data to the uncanny tentacles of global megacorporations, to very limited options for intranet-only deployments and the cost of per-user licensing for bundled services that not all users need.

Setting up your own cloud services can in some cases provide financial savings, but it will certainly provide greater control over the data you're responsible for and the way your users can access it.

That can be a distinct advantage when it comes to data protection and financial services regulation. Note, though, that you will be responsible for securing and updating the software that runs your cloud, rather than being able to leave that to a third party.


We'll guide you through setting up the open source cloud storage suite Nextcloud and Collabora Online, which adds online collaborative document editing to Nextcloud, as well as a few more common business cloud features. For brevity and convenience, we'll be using containerised versions of the software distributed using Snap and Docker.

The latest version, Nextcloud 18 Hub, includes support for an integrated OnlyOffice editing environment on the same server. This is bleeding-edge stuff, so for that reason, and because the OnlyOffice Community Edition it uses supports just 20 simultaneous open documents, we've opted not to use this approach for our tutorial.

Instead, we'll guide you through setting up the current stable version 16 snap release of Nextcloud and the more fully-featured Collabora document editing environment on a dedicated server, as this is more appropriately scalable to the needs of most businesses.

In our example deployment, we've given the Nextcloud and Collabora servers each a dedicated VM. The required spec will vary depending on how many users you have, how much they need to store and how frequently they'll access storage and edit documents.

A very basic setup - suitable for a small business or department of three to ten people - works smoothly with a single core and 1GB RAM for the Nextcloud server, and two cores and 2GB RAM for Collabora. The extra memory is particularly important here if you expect multiple users to work on their documents at the same time.

Unless you have very high storage capacity requirements, we suggest using an SSD-based system to improve responsiveness. This tutorial was written using virtual servers hosted on Vultr and that service's default Ubuntu 18.04 image, but applies to any comparable virtual or hardware server configuration.

Set up an Ubuntu 18.04 server. If your install image doesn't prompt you to do so, create a new user, add them to the sudoers group and update the server. You'll be logging in as that new user, rather than as root, whenever you need command line access to the Nextcloud server.

Now, we're ready to install the Nextcloud snap package, which packs in all required dependencies.
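On Ubuntu, this is a single command, as the package is published in the Snap Store simply as nextcloud:

sudo snap install nextcloud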

To configure your Nextcloud, connect to your server's IP address in a web browser and follow the prompts to create an admin account. Congratulations, you now have a basic cloud storage server.

To make it easily accessible and appropriately professional looking, we'll want a domain name for it; either a new domain or a subdomain of your existing web address will work well.


With an appropriate domain name registered or subdomain selected, create an A record in your registrar's DNS management portal pointing at your new Nextcloud server's IP address.

Now we'll have to tell your Nextcloud instance what its domain name is. Log in to the server at the command line.
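The snap keeps Nextcloud's configuration in a fixed location, so open it in your editor of choice:

sudo nano /var/snap/nextcloud/current/nextcloud/config/config.php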

Add your new domain name under trusted_domains, save changes and you should now be able to immediately access your Nextcloud from that URL.

With that done, it's time to run through Nextcloud's recommended server security tweaks, most importantly HTTPS support. The Nextcloud snap comes with built-in support for generating a Let's Encrypt certificate, so just run:
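sudo nextcloud.enable-https lets-encrypt

(That's the snap's bundled helper; if you only want to test, sudo nextcloud.enable-https self-signed will generate a self-signed certificate instead.)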

Then follow through the certificate creation process for your domain name. The Nextcloud Snap includes an integrated auto-renewal routine, but you can also renew your certificates at any point by re-running the creation command above.

Nextcloud needs to be able to communicate with your users for everything from registration emails to editing invitations, so you'll need an SMTP server that it can send outbound emails through.

In this example, we're integrating Nextcloud with a business that uses G Suite, so we'll use Gmail as our SMTP server. However, third-party SMTP providers of this kind may require some extra configuration on their end to work. In this instance, we had to reduce security to allow access. If users aren't allowed to manage their own less secure apps, you'll have to grant them this permission in the G Suite admin panel's Advanced Security Settings.

If you're testing Nextcloud and using a standard Gmail account for SMTP, you'll find the same setting in your personal Google Account Security options.


If you run your own mail server, you'll want to create a user for Nextcloud and point it at that.
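Whichever provider you settle on, the resulting mail settings can be entered in the admin Settings screen, or scripted with the snap's bundled occ tool. A minimal sketch using Nextcloud's documented mail config keys, with placeholder values for the Gmail example above:

sudo nextcloud.occ config:system:set mail_smtpmode --value=smtp
sudo nextcloud.occ config:system:set mail_smtphost --value=smtp.gmail.com
sudo nextcloud.occ config:system:set mail_smtpport --value=587
sudo nextcloud.occ config:system:set mail_smtpsecure --value=tls
sudo nextcloud.occ config:system:set mail_smtpauth --value=1
sudo nextcloud.occ config:system:set mail_smtpname --value=your.account@gmail.com
sudo nextcloud.occ config:system:set mail_smtppassword --value=your-password-here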

At this point, you should add a recovery email address and ideally enable two-factor authentication for your admin account. Once you roll Nextcloud out to your users, you should strongly encourage them to do the same.


If all you need is online storage, you're ready to invite users, but if you want to provide more advanced cloud services and apps, such as document editing, you'll want to add a few more features.

Click on your profile icon and select Apps. Here, you'll see all the default features of Nextcloud, such as its gallery display for images, plain text editor and PDF viewer, as well as any pending updates for them.

In the pane on the left, a category list lets you view a full range of official and third-party Nextcloud apps. There's a lot here, so you'll want to take a look through everything to see what your users are likely to need.

Nextcloud's app library includes Google Drive, Microsoft OneDrive and Dropbox integrations that can help users transfer files from third-party cloud services to Nextcloud, multimedia file playback and conversion, single sign-on and additional two-factor authentication support, web form creation, WebRTC-based video and voice conferencing, end-to-end encryption and real-time tracking of associated mobile devices, as well as more traditional office suite functionality.


For this tutorial, we're going to add a calendar, task list, and contact management. Go to Office & text and select Download and enable on Calendar, Contacts and Tasks. You may be prompted to enter your password. Once you've added these and returned to the main Nextcloud interface, you'll be able to access these via extra buttons that'll appear on the interface's top bar.


Nextcloud includes a simple integrated text editor by default, but if you need proper online document creation and editing, the Nextcloud Collabora Online app is an elegant solution. To use it, however, you'll need to set up a Collabora Online server.

Based on LibreOffice, Collabora's features include full version control, commenting and collaborative document editing, and it allows you to create word processor documents, spreadsheets and presentations. Documents are saved in standard Open Document formats, and the synced versions that'll be saved on users' devices can be opened in any compatible word processor, although you only get access to collaborative editing via the web interface.

Collabora is available as a Docker image. As it can become rather memory-hungry if you've got lots of users editing documents at the same time, we recommend giving it its own server, which also makes life a little easier when it comes to setup and configuration.

Spin up a fresh Ubuntu 18.04 server and update it. We'll be expanding on Nextcloud's official Collabora deployment instructions for this section and working on the assumption that Collabora will only need to serve a single Nextcloud instance.

While some previous iterations of Docker liked to run as root, which is reflected in the Collabora setup instructions linked above, you can and should use a normal user in the sudoers group. So, if your installation image doesn't do this for you by default:

adduser username
adduser username sudo
su username
sudo apt update
sudo apt dist-upgrade

sudo apt install docker.io
sudo docker pull collabora/code
sudo docker run -t -d -p 127.0.0.1:9980:9980 -e 'domain=subdomain\.yournextclouddomain\.tld' -e 'dictionaries=en en-gb' --restart always --cap-add MKNOD collabora/code

These parameters include British, as well as US, English dictionaries - you can add others as needed. The domain specified in the docker run command above must match the URL that your users will be using to connect to Nextcloud.


You'll also need Apache to act as a reverse proxy in front of the Docker container.

Before you can configure it properly, though, we'll need to set up TLS certificates for the subdomain it'll be using. We're again using Let's Encrypt certificates in this tutorial.

In this particular configuration, the easiest option is to stop Apache before using the Let's Encrypt certbot's certificate-only generation mode.
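Assuming certbot comes from the standard Ubuntu repositories, that sequence looks like this:

sudo apt install certbot
sudo service apache2 stop
sudo certbot certonly --standalone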

Enter the domain name you want to use for the Collabora server - we suggest using a subdomain of the same domain you're using for your Nextcloud server. Remember to create an A record in your DNS settings to point the subdomain at your new Collabora server before you try to generate the certificate.

Certbot automatically sets up a cron job to handle the required three-monthly renewals of Let's Encrypt certificates, but we'll have to make a couple of modifications to make sure it stops and restarts Apache properly. First, test renewal, as it'll have to stop your server:
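sudo certbot renew --dry-run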

If that runs without any errors, we need to create scripts for those pre- and post-renewal hooks in the appropriate directories.

sudo nano /etc/letsencrypt/renewal-hooks/pre/stop_apache

#!/bin/bash
service apache2 stop

sudo nano /etc/letsencrypt/renewal-hooks/post/start_apache

#!/bin/bash
service apache2 start

sudo chmod u+x /etc/letsencrypt/renewal-hooks/post/start_apache
sudo chmod u+x /etc/letsencrypt/renewal-hooks/pre/stop_apache

See if these are working by running:

sudo certbot renew --dry-run

You can also confirm that there's a systemd timer in place for certbot thus:
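sudo systemctl list-timers certbot.timer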

We're using Apache as a proxy here, so you'll need to enter the URL of the Collabora Online server (the one you just got a certificate for) and the path to the certificates we created earlier.
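If Apache and its proxy modules aren't already present, install and enable them first; something along these lines, with the module list taken from Nextcloud's Collabora deployment guide:

sudo apt install apache2
sudo a2enmod proxy proxy_wstunnel proxy_http ssl

Then place the virtual host configuration below in a file such as /etc/apache2/sites-available/collabora.conf (the filename is our choice) and enable it with sudo a2ensite collabora.conf: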

<VirtualHost *:443>
ServerName your.collabora.subdomain:443

# SSL configuration, you may want to take the easy route instead and use Let's Encrypt!
SSLEngine on
SSLCertificateFile /etc/letsencrypt/live/certificate.domain.here/cert.pem
SSLCertificateChainFile /etc/letsencrypt/live/certificate.domain.here/chain.pem
SSLCertificateKeyFile /etc/letsencrypt/live/certificate.domain.here/privkey.pem
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-G$
SSLHonorCipherOrder on

# Encoded slashes need to be allowed
AllowEncodedSlashes NoDecode

# Container uses a unique non-signed certificate
SSLProxyEngine On
SSLProxyVerify None
SSLProxyCheckPeerCN Off
SSLProxyCheckPeerName Off

# keep the host
ProxyPreserveHost On

# static html, js, images, etc. served from loolwsd
# loleaflet is the client part of LibreOffice Online
ProxyPass /loleaflet https://127.0.0.1:9980/loleaflet retry=0
ProxyPassReverse /loleaflet https://127.0.0.1:9980/loleaflet

# WOPI discovery URL
ProxyPass /hosting/discovery https://127.0.0.1:9980/hosting/discovery retry=0
ProxyPassReverse /hosting/discovery https://127.0.0.1:9980/hosting/discovery

# Main websocket
ProxyPassMatch "/lool/(.*)/ws$" wss://127.0.0.1:9980/lool/$1/ws nocanon

# Admin Console websocket
ProxyPass /lool/adminws wss://127.0.0.1:9980/lool/adminws

# Download as, Fullscreen presentation and Image upload operations
ProxyPass /lool https://127.0.0.1:9980/lool
ProxyPassReverse /lool https://127.0.0.1:9980/lool

# Endpoint with information about availability of various features
ProxyPass /hosting/capabilities https://127.0.0.1:9980/hosting/capabilities retry=0
ProxyPassReverse /hosting/capabilities https://127.0.0.1:9980/hosting/capabilities
</VirtualHost>

sudo service apache2 restart

Your Collabora server should now be good to go. Log in to your Nextcloud web interface as an admin, open the Settings screen and scroll down the left-hand pane until you get to Collabora Online Development Edition.

Click on that, enter the URL and port of your Collabora subdomain (https://your.collabora.subdomain:443) and click Apply.

With all your apps up and running, it's finally time to invite your users. In the Nextcloud web interface, click on your profile icon at the top right of the interface and select Users.

Create new users by clicking the '+ New user' button in the left-hand pane, then filling in the account settings you want to give them in the entry that appears at the top of the user list on the right.


You can set a password for them, which they should change after first logging in, and inform them of it. Alternatively, you can leave the password field blank and have them use the password reset feature to create their own password.

When they first log in, users should set their language and locale preferences in the Personal info section of the settings screen, again accessible by clicking on the user icon at top right. Locale determines the first day of the week for the calendar, which is set to the US Sunday-first system by default, and the language in which days are named.

As well as the website, client applications are available for Windows, Linux, macOS, Android and iOS. Users will be prompted to download these when they first connect, and they're always available via the Mobile & desktop entry in the settings screen, accessible by selecting settings from the menu that appears when you click on your user icon at top right.

Users can also search for Nextcloud clients in mobile app stores and link them by manually entering your cloud server's URL. This section also includes links to information for syncing calendar and contact data.

If you want your Android users to be able to edit documents from the Nextcloud mobile app, you should use your device management system to roll out the apk file that can be downloaded directly from Nextcloud or have them install the app from the F-Droid store, as the Google Play version is, at time of writing, lagging behind when it comes to support for Collabora Office.

The mobile app's Auto upload options allow you to select specific directories on your phone to be automatically backed up to Nextcloud, whether that's your photo gallery or a critical document folder.

Assuming you've enabled the contacts app for Nextcloud, the mobile clients will be able to automatically back up your contacts to the service every day, and you can use Bitfire's DAVx5 app for real-time calendar and contact syncing; once added, Nextcloud calendars and contacts can be accessed via your preferred app.

Tasklist management for the Tasks feature is supported in DAVx5 via OpenTasks for Android, which users are prompted to install. Sync schedules can be customised as needed, with features including the option of only syncing over Wi-Fi, and everything works seamlessly in the background.

The desktop clients let you configure file syncing and create a default Nextcloud folder, whose contents will be automatically kept in sync with your Nextcloud server. You'll want to set it to launch on system startup. You can also apply bandwidth limits and throttling, which may be helpful to those working from home or on the road.

Nextcloud and its broad range of apps and connectivity tools have great potential for any business that wishes to either switch away from or supplement third-party cloud services.

For a more customised installation or to support large numbers of users, you may wish to build from source once you've familiarised yourself with Nextcloud's systems, but the containerised versions of Nextcloud and Collabora are regularly updated and meet the core requirements for a small self-managed business cloud.



Web hosting firms to take down unsavoury content without court orders – Law Society of Ireland Gazette

The framework identifies four types of web content abuse that a domain name registry or registrar should act to disrupt without waiting for a court order.

The four types of content referred to are:

A total of 48 companies have committed to the framework so far.

But solicitor and child law expert Geoffrey Shannon has said the moves do not go far enough.

"This is a profound child safety issue," he pointed out.

"The law needs to keep pace with changes in technology and we need to ensure that there are adequate takedown procedures for harmful material," he said.

A voluntary code between internet service providers does not meet this need, the child law expert says.

"The current, non-statutory and voluntary code of self-regulation does not go far enough.

"We need a digital safety commissioner, who should have a dual role of enforcing an effective and efficient takedown of harmful material in a timely manner, as well as promoting digital safety for all our children," he said.

Proper statutory controls have been long promised, Geoffrey Shannon says, and the time is now right to bring them in.

"We're not in the business of internet censorship, and there are areas which are quite properly the domain of the police and the courts," says Blacknight's chief executive Michele Neylon.

Blacknight made the announcement ahead of Safer Internet Day, which falls tomorrow, 11 February.

The day is being marked by schools and youth organisations that wish to promote internet safety and tackle online crime.

"The important thing is to develop cooperation between internet companies," says Neylon.

"We already have these policies in place, as do many other companies, but the internet is a distributed system. A site's domain name is often registered with one company, while it points to content hosted by one or more others."

Blacknight hosts almost one-third of Ireland's websites, and provides email, connectivity and cloud services to almost 100,000 customers around the world.

"Internet technology has allowed small businesses to flourish and grow without having to be in big towns or cities. I can just as easily work with colleagues in the US as I can with people in Siberia or West Clare," says Neylon.

"But, for all of that to work, there needs to be a sense of trust in the entire system. Trust is not something that can be forced. It's something that's earned, and every single time you have a bad experience online, that will erode your trust in the system," she says.

Maintaining that trust is a shared responsibility, says Neylon, given the internet's nature as an open platform for the exchange of information.

No one person or organisation owns the internet, and infrastructure providers such as Blacknight are unable to vet every piece of content that is published, she points out.

"Internet service providers like us take very seriously our responsibility in this regard, which is why the Internet Service Provider Association of Ireland (ISPAI) operates the hotline.ie service in cooperation with the Garda," she said.

Hotline.ie was established in 1998 to provide a free, secure and anonymous service, where the public can report suspected illegal online content, including child-sexual-abuse material, human trafficking, hate speech, and financial scams.

More than 12,000 reports were received by the service in 2018.


This cloud service will streamline your workflow and simplify your life – Mashable

Just to let you know, if you buy something featured here, Mashable might earn an affiliate commission.

Image: Pixabay

By StackCommerce, Mashable Shopping | 2020-02-10 15:15:23 UTC

TL;DR: Get a 10-year subscription to Rethink Files for just $49 with this great sale.

You know what's annoying? Searching your entire Google Drive for a single document, only to discover you saved it to your Dropbox account. Sound familiar? Then you need to know about Rethink Files, a universal file hub that houses all your cloud storage files in one place.

Most of us have files all over the damn place from Google Drive, Dropbox, and Outlook to OneDrive, Slack, and Trello. And it's not just an inconvenience, it's also a major time suck. Rethink Files offers a much simpler, less headache-inducing solution. It connects all your other cloud storage accounts so you can access everything you need in one slick interface. Oh, and you'll get a massive 2TB of secure storage. That's plenty of space, even if you're a photographer or video junkie.

The interface is similar to that of Google Drive, with a streamlined design and colorful icons showing you the different file types. Speaking of which, Rethink Files also shows you rich previews for over 100 different file types. Plus, every file you store with Rethink is maintained and kept private using AES 256-bit encryption. Let's not ignore the importance of security these days.

Run a company or a small team? Even better. Lots of companies rely on Rethink Files, with the CTO of ConversionFly noting: "Rethink Files has helped our company keep our assets organized, saving us a ton of time and headache from having to share and access files across different platforms."

Now about the cost: a professional subscription to Rethink Files typically costs $15 per month. Mashable readers, however, can get a decade of universal storage for just $49. That's a little over 40 cents per month, but who's counting?


We Need to Talk About Cloud Neutrality – WIRED

We spent a lot of years talking about net neutrality: the idea that the companies that provide access to the internet shouldn't unfairly block, slow down, or otherwise interfere with traffic, even if that traffic competes with their services. But there's an even bigger issue brewing, and it's time to start talking about it: cloud neutrality.

While its name sounds soft and fluffy, Microsoft president and general counsel Brad Smith and coauthor Carol Ann Browne write in their recent book, Tools and Weapons: The Promise and the Peril of the Digital Age, that "in truth the cloud is a fortress." Their introduction describes the modern marvel of the data center: a 2 million-square-foot, climate-controlled facility made up of colossal electrical generators, diesel fuel tanks, battery arrays, and bulletproof doors. At its center is what they call a temple to the information age and a cornerstone of our digital lives: thousands of machines connected to the fastest possible internet connections, providing offsite storage and computing power to businesses that otherwise couldn't possibly afford the hardware for all that storage and computing power.

Smith and Browne note cheerfully that Microsoft operates or leases more than 100 such facilities in 20-plus countries and hosts at least 200 online services. Each data center costs hundreds of millions of dollars to build and many millions more to maintain, and you pretty much can't build a successful new company without them. So, thank goodness for Microsoft, right?

The book means to portray this might and power as both a source of wonder and an enabling feature of the modern economy. To me, it reads like a threat. The cloud economy exists at the pleasure, and continued profit, of a handful of companies.

The internet is no longer the essential enabler of the tech economy. That title now belongs to the cloud. But the infrastructure of the internet, at least, was publicly financed and subsidized. The government can set rules about how companies have to interact with their customers. Whether and how it sets and enforces those rules isn't the point, for now. It can.

That's not the case with the cloud. This infrastructure is solely owned by a handful of companies with hardly any oversight. The potential for abuse is huge, whether it's through trade-secret snooping or the outright blocking, slowing, or hampering of transmission. No one seems to be thinking about what could happen if these behemoths decide it's against their interests to have all these barnacles on their flanks. They should be.

Almost every modern tech company is paying to outsource its storage and computing services, either all or in part, to the cloud. This setup allows startups to emerge with very little overhead, and huge companies to run more efficiently by avoiding investment in physical hardware. It has spawned a generation of companies that plan to use the cloud to offer everything as a service.

But turn that transaction around, and you realize that the companies that actually built and operate the cloud are essentially incubating and hosting their competition. One easy example? Netflix runs its streaming video product on the cloud-based Amazon Web Services; indeed, it was widely praised for saving money by going all in on AWS in 2009 and 2010. Amazon started its own streaming service in 2011. The two have coexisted for a decade now, but how long will the famously ruthless Amazon tolerate that situation?

The problem is that few have the resources to replicate the cloud infrastructure, should the landlords suddenly turn on their tenants.

The big three cloud providers in the world are Amazon, Google, and Microsoft. They've collectively spent tens of billions of dollars on data center infrastructure. And to be clear, they have profited handsomely from those investments. Just last week, in fact, Alphabet revealed its cloud services revenue for the first time: it accounts for nearly $9 billion of the company's $37.57 billion in quarterly earnings, up more than 50 percent from 2018. Amazon's AWS business made almost $10 billion, and Microsoft's Azure business made almost $12 billion. Cloud computing was a $141 billion market in 2018.


Why fast object storage is poised for the mainstream – Blocks and Files

Four years ago, Pure Storage pioneered fast object storage with the launch of its FlashBlade system. Today fast object storage is ready to go mainstream, with six vendors touting the technology.

Object storage has been stuck in a low-performance, mass data store limbo since the first content-addressed system (CAS) was devised by Paul Carpentier and Jan van Riel at FilePool in 1998. EMC bought FilePool in 2001 and based its Centera object storage system on the technology it acquired.

Various startups, including Amplidata, Bycast, CleverSafe, Cloudian and Scality, developed object storage systems. Some were bought by mainstream suppliers as the technology gained traction. For instance, HGST bought Amplidata, NetApp bought Bycast and IBM bought CleverSafe.

Objects became the third pillar of data storage, alongside block and file. Object storage was seen as ideal for unstructured data that didn't fit in the highly structured database world of block storage or the less highly structured file world. Its strengths include scalability, the ability to deal with variably-sized lumps of data, and metadata tagging.

Object storage systems typically used disk storage and scale-out nodes. They did not take all-flash hardware on board until Pure Storage rewrote the rules with FlashBlade in 2016. Since then, only one other major object storage supplier, NetApp with its StorageGRID, has focused on all-flash object storage. This is a conservative side of the storage industry.

Common sense is one reason for industry caution. Disk storage is cheaper than flash, and object storage data typically does not require low-latency, high-performance access. But this is changing, with applications such as machine learning requiring fast access to millions of pieces of data. Object storage can now serve this kind of application thanks to cheaper flash media, such as QLC, and faster, tuned object software stacks.

A look at products from MinIO, OpenIO, NetApp, Pure Storage, Scality and Stellus shows how object storage technology is changing.

MinIO develops open source object storage software that executes very quickly. It has run numerous benchmarks, as we have covered in a number of articles. For instance:

MinIO has demonstrated its software running in the AWS cloud, delivering more than 1.4Tbit/s read bandwidth using NVMe SSDs. It has added a NAS gateway that is used by suppliers such as Infinidat. Other suppliers view MinIO in a gateway sense too. For example, VMware is considering using MinIO software to provision storage to containers in Kubernetes pods, and Nutanix's Buckets object storage uses a MinIO S3 adapter.

All this amounts to MinIO object storage being widely used because it is fast, readily available, and has effective S3, NFS and SMB protocol converters.
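That ready availability is easy to demonstrate: a single container yields a working S3-compatible endpoint. A minimal sketch using MinIO's published Docker image (the credentials and data path here are placeholders, and the environment variable names are those MinIO documented at the time):

docker run -p 9000:9000 -e "MINIO_ACCESS_KEY=minioadmin" -e "MINIO_SECRET_KEY=minioadmin" minio/minio server /data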

OpenIO was the first object storage supplier to demonstrate it could write data faster than 1Tbit/sec, reaching 1.372Tbit/s (171.5GB/sec) from an object store implemented across 350 servers. This is faster than Hitachi Vantara's high-end VSP 5500, at 148GB/sec, but slower than Dell EMC's PowerMax 8000, with its 350GB/sec.

The OpenIO system used an SSD per server for metadata and disk drives for ordinary object data, with a 10Gbit/s Ethernet network. The company says its data layer, metadata layer and S3 access layer all scale linearly, and it has workload-balancing technology to pre-empt hot spots (choke points) occurring.

Laurent Denel, CEO and co-founder of OpenIO, said: "We designed an efficient solution, capable of being used as primary storage for video streaming or to serve increasingly large datasets for big data use cases."

NetApp launched the all-flash StorageGRID SGF6024 in October 2019. The system is designed for workloads that need high concurrent access rates to many small objects.

It stores 368.6TB of raw data in its 3U chassis and there is a lot of CPU horsepower, with a 1U compute controller and 2U dual-controller storage shelf (E-Series EF570 array).

Duncan Moore, head of NetApp's StorageGRID software group, said the software stack has been tweaked and there is scope for more improvement. Such efficiency was not needed before, as the software had the luxury of operating in disk seek time periods.

FlashBlade was a groundbreaking system when it launched in 2016 and it still is. The distributed object store system uses proprietary hardware and flash drives and was given file access support from the get-go, with NFS v3. It now supports CIFS and S3, and offers up to 85GB/sec performance.

Pure Storage markets FlashBlade for AI, machine learning and real-time analytics applications. The company also touts the system as the means to handle unstructured data in network-attached storage (NAS), with FlashBlade wrapping a NAS access layer around its object heart.

The AIRI AI system from Pure, with Nvidia GPUs, uses FlashBlade as its storage layer component.

Scality is a classic object storage supplier which has seen an opening in edge computing locations.

The company thinks object storage on flash will be selected for edge applications that capture large data streams from mobile, IoT and other connected devices: logs, sensor and device streaming data, vehicle drive data, and image and video media data.

The data is used by and needed for local, real-time computation and Scality supports Azure Edge for this.

Stellus Technologies, which came out of stealth last week, provides a scale-out, high-performance file storage system wrapped around an all-flash, key:value store (KV store) software scheme. Key:value stores are object storage without any metadata apart from the object's key (identifier).

An object store contains an object, its identifier (content address or key) and metadata describing the object data's attributes and aspects of its content. Object stores can be indexed and searched using this metadata; KV stores can be searched only on the key.

Typically, KV stores contain small amounts of data while object stores contain petabytes. Stellus gets over this limitation by having many KV stores (up to four per SSD), many SSDs and many nodes.

The multiple KV stores per drive and an internal NVMe-over-Fabrics access scheme provide high performance using RDMA and parallel access. This is at least as fast as all-flash filers and certainly faster than disk-based filers, Stellus claims.

There are two main ways of accelerating object storage. One is to use flash hardware with a tuned software stack, as exemplified by NetApp and Pure Storage. The other is to use tuned software, with MinIO and OpenIO following this path.

Stellus combines the two approaches, using flash hardware and a new software stack based on key:value stores rather than full-blown object storage.

Scality sees an opening for all-flash object storage but has no specific version of its RING software to take advantage of it yet. Blocks & Files suggests that Scality will develop a cut-down and tuned version for edge flash object opportunities, in conjunction with an edge hardware system supplier.

We think that other object storage suppliers, such as Cloudian, Dell EMC (ECS), Hitachi Vantara, IBM and Quantum, will conclude they need to develop flash object stores with tuned software. They can see the possibilities of QLC flash lowering all-flash costs and the object software speed advances made by MinIO, OpenIO and Stellus.


Choosing the right disaster recovery for your business – ComputerWeekly.com

Historically, building and maintaining a disaster recovery (DR) site, while critical to ensure business continuity, was often too costly and complex for most companies.

As Rajiv Mirani, chief technology officer at Nutanix, points out: "It simply wasn't feasible for many enterprises to pay for the upfront costs and ongoing expenses of maintaining a second site, to be used only in the event of a disaster."

From a DR perspective, the starting point for most large enterprises is their core IT infrastructure, which is often based on a primary on-premise datacentre or private cloud.

This is then supported by a secondary DR site at a separate geographic location. There, the core and primary systems and data are backed up and replicated, ready for activation in the event of the primary site suffering a failure and no longer being able to serve the business in a reasonable capacity.

Clearly, replicating an entire datacentre, with all its equipment, management, cooling and power needs, is a huge expense. Some businesses may require a hot standby, where the disaster recovery centre switches over the moment the primary site goes down. This is the most expensive option.

Such a site can be fully redundant, with data synchronised between the two sites, leading to minimal disruption in the event of a failure. Others are warm, or on standby, which leads to a certain level of delay before the backup site is fully operational.

In July 2019, analyst IDC forecast public cloud spending would grow annually by 22.3%, from $229bn in 2019 to nearly $500bn in 2023.

The analyst firm noted that infrastructure-as-a-service (IaaS) spending, comprising servers and storage devices, will be the fastest-growing category of cloud spending, with a five-year compound annual growth rate of 32%.

These figures illustrate that cloud computing is increasingly becoming a mainstay of enterprise IT. Feifei Li, president and senior fellow of database systems at Alibaba Cloud Intelligence, explains: "Organisations around the world are embracing a range of disaster recovery solutions to protect themselves against hardware and software failures and ensure zero downtime for their business applications. These solutions are business-critical but can be costly."

"Cloud-native DR provides a cost-effective option for customers to back up data in case of a disaster with a pay-per-use pricing model."

However, one size doesn't fit all when it comes to computing architecture, cloud and disaster recovery. It is not simply a matter of moving cloud-ready, business-critical applications and data to the cloud and hoping that the existing storage architecture provides the right set of services to support them.

Enterprise compute environments differ vastly, and there are many environments that, either through technical reasons or data movement regulations, cannot be hosted or backed up in a cloud environment.

While they promise attractive IT economics and easy accessibility, Mirani says, cloud-based DR services come with their own challenges.

IT companies claim technology is now advanced enough, given the right architecture, for them to offer zero-time recovery. In reality, though, this depends on a number of factors, all of which must be assessed. But first and foremost, organisations must define DR plans that prioritise their mission-critical data and applications, and how much downtime they can sustain before the business begins to suffer.

Recovery point objectives (RPO) and recovery time objectives (RTO) are used in the industry to measure business continuity.

"RPO describes how often I need to protect my data to ensure that I can get back to what I had when disaster or data corruption struck, or as close to that as possible," says Tony Lock, a principal analyst at Freeform Dynamics.

"It relates to how fast my data changes. RTO is how quickly I must make the recovered data available should disaster strike or a request come in from a user, an auditor or even a regulator."
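To make the RPO idea concrete: if the business can tolerate losing at most one hour of changes, data must be protected at least hourly. As a sketch, a cron entry such as the following (backup.sh is a hypothetical backup script) would satisfy a one-hour RPO, while the RTO clock only starts once a restore is needed:

0 * * * * /usr/local/bin/backup.sh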

Lock says that answering these fundamental questions is simple enough for small numbers of different data sets. However, it becomes complicated very quickly when you have lots of different data sets of varying business importance, all of which may have very dissimilar protection and recovery needs.

Enterprises may be at different stages of cloud maturity. Some are at the planning stage, while others, that have deployed workloads in the public cloud and are comfortable with how to manage their on-premise and public cloud environments, may be on a path to multicloud and hybrid datacentre transformation.

Whatever the level of cloud maturity, there are good reasons to use the cloud as an emergency backup in the event of a disaster.

In fact, according to the Gartner Magic Quadrant for the backup and disaster recovery market, published in October 2019, backup and recovery providers are consolidating many features such as data replication, (cloud) disaster recovery automation and orchestration, and intercloud data mobility on a single platform.

In addition, Gartner's study reported that backup providers are adding data management functionality to their backup platforms to support analytics, test and development, and/or ransomware detection on cloud copies of backup data.

By bundling these additional services, the backup and disaster recovery providers are looking to deliver a higher return on investment in data protection.

Cloud-based disaster recovery services can be hybrid in nature; they move infrastructure and services between the cloud and on-premise datacentres.

Traditional hardware companies now offer capacity-on-demand and flexible consumption-based pricing, providing managed services wrapped around their cloud or pay-as-you-go offerings.

Traditional data backup hardware (tape backup) and software companies are also building out scalable DR cloud platforms, giving their customers a two-or-more-tier approach to their DR.

Architecturally, the customer's business continuity and disaster recovery system is located on-premise in the primary and/or secondary datacentre.

Through a managed service offering, the disaster recovery provider manages the backup and replication of data, not only locally on the customer's hardware but also, as part of the service, in the cloud, where a mirror copy of virtual servers, applications and data is kept, ready to spin up in the event of a disaster that takes out the customer's primary and/or secondary on-premise systems.

Hyperscale cloud platform providers, such as Amazon Web Services, Microsoft, Google and Alibaba, have security, redundancy and recovery measures in place that make it very unlikely for them to lose your data, but they are not infallible, says Bola Rotibi, research director of software development at CCS Insight, who warns that many organisations falsely assume data and information stored in cloud applications and services is safe from loss.

"Without a plan that actively addresses protecting critical data stored in the cloud through software-as-a-service solutions in operations, that comfort blanket could just as easily smother an organisation when the light gets turned off," she says.

Beyond the basic requirements of using cloud-based, on-premise or a hybrid approach to DR, there are plenty of tools available that can help make general operational IT and DR run efficiently and cost-effectively. These need to be considered alongside the DR platform choices. Such tools are generally designed to help IT administrators responsible for data and backups understand, visualise and manage the lifecycle of all of the data across the organisation, as well as the relationships between different datasets, systems and people that require access.

With an effective data and information management policy and supporting toolset, enterprises are able to have a better view of what data should remain on-premise or on private cloud platforms, and what data can reside in public cloud systems.

Building on this data governance framework, data backup, replication and recovery policies for disaster can be applied to critical and less critical data and their associated applications.

Along with effective data and information management, data discovery can be used to help an organisation understand and regain control over its data.

Data discovery not only helps to mitigate expensive industry compliance penalty charges, but also enables IT administrators to have better insight into the organisations data. This helps with cost optimisation.

Data discovery lets IT departments see where their data resides across disparate geographic systems or locations, and classify the criticality of that data. The discovery process can check to ensure the data is compliant with legal or regulatory requirements and corporate data governance policies.

While it may not be seen as part of DR, data discovery has its place alongside data retention policies, cyber security and data loss prevention initiatives, as part of a firm's data stewardship.

As is the case across many aspects of IT, artificial intelligence (AI) and machine learning (ML) also have their place in DR and business continuity.

"Thanks to advances in AI and ML, many routine administration tasks that were previously people-intensive can now be fully automated. Automation, performance, high availability and security are key differentiators when choosing DR solutions," says Li.

"For example, many customers prefer virtual machine snapshot backup with high snapshot quota and a flexible automatic task strategy, which helps reduce the impact on business I/O [input/output]."

Disaster recovery automation can run tests on applications, for example, or recover data to another environment for testing or development. Typically, AI is used to track metrics relevant to data backup and recovery, such as a performance statistics, rates of change, speed of access, performance insights and bottlenecks.

The AI dynamically makes changes to the DR system if it needs to be optimised, re-prioritised or modified to improve a desired business outcome, such as a service-level agreement to speed up recovery after a system failure.

In essence, the AI ensures the disaster recovery is running optimally and matches the requirements of the business.

Additional reporting by Cliff Saran.
