
The three keys to a successful, data-driven 2020 – Gigabit Magazine – Technology News, Magazine and Website

It won't come as a surprise to anyone that data growth is on the up, but what may be less widely known is that the places where data is being generated are starting to change rapidly. Businesses will start to see more of their data being produced in the cloud or at the edge, rather than in traditional data centres. The main impact will be on how these businesses analyse and understand their data in those locations, and then how they manage it there, without needing a data centre as a go-between.

For 2020, there are three main areas that are predicted to experience the most change when it comes to data:

Data at the edge needs data management at the edge

Edge buildout is already happening, and its pace is accelerating with trends like IoT, self-driving cars, and biometrics. IDC predicts that, by 2023, 50% of all enterprise IT infrastructure will be deployed at the edge, up from 10% today. More apps are generating much more data at the edge, raising the question of why data should not be better understood and managed directly at the edge. Imagine if you could analyse data, figure out what data is useful and needs to be brought back to the data centre, and directly process the rest of the data at the edge itself without having to first move it all. This is why edge data management will rise in importance over the next few years.

SEE ALSO:

Cross-cloud data management as a requirement

Most enterprises using the public cloud already have a hybrid or multi-cloud strategy, and enterprises are increasingly choosing to diversify their cloud strategy across two or more clouds, giving customers more freedom and choice. As this trend continues, enterprises need a simple way to understand and manage their data sprawl across clouds and the hybrid enterprise, leading to greater demand for cross-cloud data management solutions that are not tied to any particular cloud or storage vendor.

Analytics-driven AI takes centre stage

For the last couple of years, Artificial Intelligence (AI) and Machine Learning (ML) have been a big theme, and this trend is continuing to grow. While these began as marketing buzzwords, the potential of AI in data management is clear: how can you manage something you don't understand? By using analytics to drive AI/ML, analytics-driven data management solutions can leverage that understanding of data to deliver better management. This year, businesses will see more exciting developments in this space that use deep analytics and data lakes across different storage repositories to help them better manage their data.

As the new decade gets well underway, the amount of data produced, stored and analysed continues to rise. Because of this, businesses in every industry will need to work towards being able to manage this data wherever it resides, from a data centre all the way to the edge or even up in the cloud.

By Krishna Subramanian, COO at Komprise


Miss the look of Windows 7? Change these settings – Komando

It's official. Microsoft is no longer offering any support for Windows 7. Support ended on Jan. 14, 2020.

This means Windows 7 will no longer have support specialists backing it up or updating it, and in order to have a safe computer, you'll need to upgrade your OS to Windows 10. Tap or click here to learn how to get Windows 10 for free.

But what if you really like the look of Windows 7, and Windows 10 has features you don't want? Well, if you're willing to learn a few tech tricks, we can help you make your computer look and act like it still has Windows 7, starting with the Start menu.

The Windows 10 Start menu introduces a new tile format in addition to the traditional alphabetical menu list. If you hate this layout, you can get the Windows 7 style back in two ways.

The first involves the removal of pinned apps. It's not a perfect recreation of Windows 7, but it's the simpler way to get the idea of the old Start menu back. It also serves as a great visual transition from 7 to 10.

To unpin applications from the Start menu, right-click each tile and select Unpin from Start. Repeat until the tile area is empty, then drag the edge of the menu inward so only the traditional app list remains.

The second way to get the Windows 7 Start menu back is to download the third-party app Open Shell, once known as Classic Shell. This way is much more technical, but it'll get you the exact Windows 7 Start menu look you want.

To get and use Open Shell:

If you miss the aesthetic of Windows 7, as well as the Start menu functionality, we recommend learning how to use Open Shell. If you just want a simpler menu that's similar to Windows 7, the unpinning method should do the trick.

File Explorer is only slightly different in Windows 10, but there are still new features like the Ribbon. As above, there's a way to get the idea of the Windows 7 version back, and a way to make things look exactly the same with a third-party program.

To get the basic Windows 7 File Explorer structure:

If you hate the ribbon from the Windows 10 File Explorer, and want the Windows 7 look back entirely:

Note: Ribbon Disabler makes a backup of the original File Explorer program files, but it also replaces them. This means the app affects your computer's core files and should be used with extreme caution.

Windows 10 really pushes Cortana (its built-in AI helper), its bottom search bar and OneDrive, its cloud storage system. Tap or click here to learn about other great cloud storage programs.

If you have no interest in these new features, you can get them out of your way pretty easily, and this time you can do it without third-party software. Let's start with clearing up your view.

To hide things in the taskbar, right-click an empty area of the taskbar, hover over Search, and click Hidden.

If you have a Cortana menu instead of Search, click Hidden there to make that button go away, and deselect Show Task View button to make that disappear too.

To disable Cortana, and not just hide her, do the following:
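If you're comfortable going a step further, one widely documented approach (not necessarily the exact steps this article had in mind) is to set the AllowCortana policy value in the registry, either by hand in Registry Editor or via Group Policy. The Python sketch below simply automates setting that value; it assumes administrator rights, and you'll need to sign out or reboot afterwards.

```python
import winreg

# Set the AllowCortana policy to 0 (disabled). Requires administrator rights;
# sign out or reboot afterwards for the change to take effect.
key_path = r"SOFTWARE\Policies\Microsoft\Windows\Windows Search"
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "AllowCortana", 0, winreg.REG_DWORD, 0)
```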

To hide OneDrive:

To disable OneDrive:

So to all the Windows 7 lovers: You're going to be OK. You can get back more of the classic aesthetic by downloading Windows 7 wallpapers and playing with Open Shell skins.

Updating your OS is important for your computer's health. Tap or click here to learn more about why that is. It doesn't mean losing your favorite features though, so long as you're a little tech savvy.


Disaster recovery failover choices: Synchronous mirrors, P2V and the cloud – ComputerWeekly.com

Ask a line of business manager about disaster recovery, specifically what data they need to recover and when by, and the answer is likely to be: "All of it, and now."

As businesses depend increasingly on data to function, how to recover from an IT systems failure becomes increasingly important. Businesses and their customers are less and less tolerant of downtime, and of data loss.

So, IT departments are being forced to look again at how quickly, and how comprehensively, the organisation can get back on its feet.

That pressure has forced IT systems architects and DR teams to work to shorter recovery time objectives (RTOs) and recovery point objectives (RPOs).

In practice, this means organisations back up more data, more frequently, and need to restore it more swiftly. The RPOs and RTOs agreed with the business in turn determine the type of technology selected for recovery and business continuity.

As Phil Goodwin of analysts IDC points out, the trend is for businesses to move towards high availability rather than disaster recovery.

Developments in virtualisation and cloud computing have made that goal more realistic for a wider range of organisations.

The gold standard of business continuity and high availability is synchronous mirroring. For effective disaster recovery, the mirror needs to be offsite, in the business's own secondary or backup datacentre, at a colocation site, or at a disaster recovery supplier's location, such as 4SL or Sungard AS.

The location will depend on the risks and threats the organisation faces. Physical threats, such as extreme weather or terrorism, will mean the secondary site will need to be further away. But this puts more pressure on network infrastructure, and invariably increases costs. Companies will need high bandwidth, and possibly redundant, links between the two mirrors.

"This is the standard for business-critical applications and services, but it's not cheap because you need two infrastructures and at least one robust, appropriately sized link between them," says Barnaby Mote, CEO at managed service provider 4SL.

Synchronous mirroring is the favoured option in industries that have a very low tolerance for downtime and very short recovery time objectives. These include financial services, as well as some areas of government.

Businesses often choose to manage synchronously mirrored datacentres in house, because primary and backup sites must be kept tightly aligned. This pushes up costs and reduces flexibility, as the technology platforms have to stay in sync.

Some businesses opt for a lower performance setup at their secondary sites in order to save money, but this will depend on how performance-sensitive applications are.

IT teams could also save money by only deploying synchronous mirroring for their most critical applications. In practice, organisations tend to lift and replicate entire environments, including all storage and data. This is because of the work needed to separate out critical applications, and the risk that by dividing up the infrastructure something would be missed and cause the copy to fail.

Physical-to-virtual failover is a lower cost and potentially more flexible way of providing real time or near real time backup, and quick restoration.

The most obvious use case is for environments that are already virtualised. Companies have the choice to run in-house tools to replicate VMs, or use a service provider. VMware's vSphere Replication offers single- and multi-site data protection, as does Microsoft's System Center Data Protection Manager.

One advantage of failing over to a virtual environment is to be able to use shared infrastructure for the backup site, to cut costs and reduce management overheads.

Another trend is for suppliers to offer platform-agnostic VM replication services. This allows firms to run backups on alternative infrastructure, and to failover a heterogeneous system to a single backup platform. This also opens up the option to use the cloud for replication.

The model is best suited to organisations already running virtualised environments. Technologies exist for physical-to-virtual (P2V) recovery, restoring an entire backed-up environment to a virtual machine, but backup copies need to be created regularly and moved offsite. Nor is failing over to a virtual environment suitable for companies that need continuous access to their data, such as those trading in financial markets.

On the plus side, most suppliers now provide bare-metal recovery for virtual backups, which will help bring the business back online quickly. IT can also store local backups of VMs for quick recovery (for example, if local hardware fails) at the same time as staging copies to offsite storage for true DR.

The choice of dedicated or shared failover environments will again depend on the business's RPO/RTO requirements and its budgets.

And, while backup to a virtual environment is a good option to save costs and cut complexity, for businesses that run highly-virtualised production environments it can be the only practical option.

Conventional backup and recovery technology does not fit well with virtualised systems because of their shared infrastructure and especially, shared storage. A VM-specific backup system is the best way to avoid bottlenecks and ensure safe recovery of the VMs.

The cloud really extends options for businesses to back up their data, and their production environments.

Suppliers that offer backup services for VMs increasingly offer cloud storage too. Acronis, for example, provides platform-agnostic VM backup, while Microsoft Azure can back up Azure VMs, SQL and a business's local VMware machines. Products from suppliers such as Veeam and Commvault also support replication to the cloud.

Businesses can save money by only spinning up virtual servers when they invoke their DR plan, but they will of course have to pay for storage.

"Recovery into the cloud is feasible primarily for virtual environments; physical machines are problematic," says 4SL's Mote. "A range of RTO/RPOs are possible depending on the technology."

Cloud services also work well for smaller businesses that might not have the skills and staff to run backup hardware. Companies can start with simple, online file storage or lower-end data backup services and scale up to application or VM backup as they grow.

"The IT used to assist in DR has matured greatly over the course of the past decade," says Freeform Dynamics' Tony Lock.

Options exist at hardware level to copy or snapshot and replicate data between similar platforms, sometimes even over distances that previously would have been either impossible or prohibitively expensive. At the same time, software tools have been developed to deliver similar capabilities but between different hardware platforms.

Backing up to hybrid environments is more complex than either straightforward physical-to-physical backup, or replication of an entire (virtualised) environment to the cloud.

On paper, a hybrid approach allows the business to decide which workloads should be mirrored, backed up to a virtual environment, or replicated to the cloud. The challenge is to decide which data and which workloads go where, and to maintain consistency.

Recovery from a hybrid environment will also be more complex. IT teams need to ensure all elements can be recovered from each platform after a full-scale disaster. But they also need a plan to deal with operational recovery, such as deleted files. Data egress charges levied by cloud providers are a cost that is easy to overlook.


Alibaba Cloud Named First Public Cloud Vendor in the World to Obtain Trusted Partner Network (TPN) Certification – Business Wire

HANGZHOU, China--(BUSINESS WIRE)--Alibaba Cloud, the data intelligence backbone of Alibaba Group, has today announced that it is the first public cloud vendor in the world to obtain the prestigious Trusted Partner Network (TPN) certification, an achievement that validates the entertainment industry's confidence in its robust security and trustworthiness as a cloud service provider. With viewing consumption habits changing rapidly, broadcast platforms constantly evolving and new production techniques emerging globally, Alibaba Cloud is successfully helping the entertainment industry to revolutionise how it works in order to respond to and embrace these changing dynamics by offering a highly secure, dependable, flexible and scalable cloud-based platform.

The TPN is a joint venture between two major entertainment industry associations: the Motion Picture Association of America (MPAA) and the Content Delivery & Security Association (CDSA). The TPN's goal is to help companies ensure content security and prevent leaks, breaches and hacks of movies and TV shows before they are released, by creating a single, central, global directory of trusted partner vendors. The certification is significant because it satisfies content producers that Alibaba Cloud's use of industry best practices ensures that its solutions, facilities, people and workflows are secure, as certified by experienced industry evaluators.

To secure this highly prestigious accolade, which distinguishes Alibaba Cloud's solutions, the company had to undergo very stringent auditing and evaluation processes. A number of Alibaba Cloud solutions suited to the entertainment industry were tested, including: Object Storage Service, an encrypted and secure cloud storage service that stores, processes and accesses massive amounts of data from anywhere in the world; Express Connect, an easy-to-use network service that enables high-bandwidth, reliable, secure and private connections between different environments; Cloud Storage Gateway, which uses OSS for cloud-based storage at the back end and supports standard file and block storage protocols; and Key Management Service, which facilitates the creation, deletion and management of encryption keys. All of Alibaba Cloud's audited solutions passed TPN's demanding tests.

"By completing the on-premises TPN audit process successfully, Alibaba Cloud demonstrates its mature abilities in securing media content with its facility and infrastructure capabilities. As the future of media productions is shifting to public cloud platforms, it is essential for vendors like Alibaba Cloud to pioneer innovations that will propel the advancement of the entertainment industry in a digital era," said Drew Branch, Senior Security Consultant at Independent Security Evaluators.

Today, world-class production houses, including Animal Logic and Territory Studio, have already been assured by Alibaba Cloud's ability to meet their demands and have embraced its solutions. They are enjoying the benefits of using more cloud computing technology to drive efficiencies in media production, from faster decision-making to easier collaboration between remotely located artists and developer teams.

Commenting on the certification, Yuanbin Zheng, Head of Security Compliance and Privacy at Alibaba Cloud Intelligence, said: "With such high-value and sensitive assets to protect, production houses are naturally drawn to the solutions that offer the highest levels of security. Not only does the TPN certification recognise the effort that Alibaba Cloud has made to deliver industry-leading levels of security, it also acknowledges the dependability, flexibility and scalability of our cloud-based platform. Furthermore, as the first public cloud vendor to be accredited with the TPN certification, the accolade further reinforces Alibaba Cloud's market-leading position as well as its solutions' now-proven ability to meet the needs of the entertainment industry."

About Alibaba Cloud

Established in 2009, Alibaba Cloud (www.alibabacloud.com), the data intelligence backbone of Alibaba Group, is among the world's top three IaaS providers, according to Gartner, and the largest provider of public cloud services in China, according to IDC. Alibaba Cloud provides a comprehensive suite of cloud computing services to businesses worldwide, including merchants doing business on Alibaba Group marketplaces, start-ups, corporations and government organisations. Alibaba Cloud is the official Cloud Services Partner of the International Olympic Committee.


The best free and open-source alternatives to Google Drive on Android – Android Police

We've already looked at open-source alternatives to several major Google apps and services in this series, but there are still a few categories left to go over. Now it's time to check out the open-source equivalents to Google Drive, the company's cloud storage product.

Thankfully, the feature gap between Google Drive and the alternatives isn't massive: all of them have clients for desktop and mobile, easy file sharing, and other features. Depending on what hardware you have on hand, these options might not even cost you anything.

Why does open-source matter?

Free and open-source software (FOSS) has a number of advantages, but to most people, the main benefit is privacy. All the code is out in the open, so anyone with programming knowledge can go through it and see exactly what an app is doing. Proprietary apps can sometimes feel like black boxes, where you don't really know what's going on behind the scenes. That's rarely the case with FOSS.

I say 'rarely' because there's technically nothing stopping open-source apps from spying on you, but that behavior is extremely rare. If a developer is doing something they're not supposed to be, like spying on users or bundling malware, they probably wouldn't announce it to the world, but with the code out in the open, someone is likely to catch it.

Many people simply prefer open-source apps out of principle, in the same way that some people prefer shopping at locally-owned stores instead of Walmart or Target. These apps are often created by individuals or small groups in their spare time, as opposed to large companies with income generated from advertising or venture capital.

NextCloud is widely regarded as the gold standard for hosting your own cloud. It goes far beyond simply hosting files: there are plugins for adding a task manager, a calendar, collaborative document editing (akin to Google Docs), video conferencing tools, notes, and much more. While the Android app only supports managing files, there are some Android clients for NextCloud plugins. For example, the NextCloud News app allows you to synchronize RSS feeds with the RSS plugin, giving you a self-hosted RSS reader service.

Setting up a NextCloud installation is the tricky part: you either need an always-on PC that you can run the server software from (even a $35 Raspberry Pi will do the job), or you can use a hosting service like Webo, CiviHosting, or Hostio. Unless you have experience with Linux servers, I'd recommend just paying for a hosting service.

As previously mentioned, the NextCloud Android app is primarily for mobile file management. You can upload/download/share files, sync files and folders to your device for offline access, and even auto-upload photos to your server. That last feature makes NextCloud a possible Google Photos replacement too, as long as you have enough storage for all your pictures.

The NextCloud web interface

You can try a demo of the NextCloud web interface here. If you save the temporary username and password to your Chrome passwords (or other password manager), you can also use it to test out the Android app.

I'm including OwnCloud here mostly because I didn't want to have just two options for this article. NextCloud was originally based on OwnCloud, and while NextCloud has flourished in the years since it became a separate project, OwnCloud has seen slower development. Much of the ownCloud development team has moved on to NextCloud, including the original founder, Frank Karlitschek.

OwnCloud functions almost identically to NextCloud: you have to set up your own server (or pay for a hosting service), there are many plugins available, and so on. While there are a few plugins on ownCloud that aren't available on NextCloud, the vast majority of them are enterprise-focused, and not anything you would probably care about.

OwnCloud mostly remains popular because of its existing userbase, but if you're starting fresh, it's probably a better idea to go with NextCloud.

If you really don't want to deal with setting up a server, or paying someone to host a server for you, you might like Syncthing. It's not a cloud service: it uses technology based on BitTorrent to synchronize files across all your devices using peer-to-peer data transfer. While this has the advantage of being completely free, you can't share files like you can with NextCloud or OwnCloud.

The Syncthing app is extremely bare-bones: it syncs your files to your phone or tablet... and that's it. All data is saved in the actual Android file system, so you don't have to manually export anything you want to open on your phone. This also makes apps like Moon Reader easier to use, since they can scan the Syncthing folder for new files on their own.

If you don't need the advanced sharing and collaborative access features that NextCloud and OwnCloud offer, Syncthing is a free and simple alternative worth checking out. It's certainly great from a privacy perspective: your files never leave your own devices.


Formulus Black’s Forsa Software Named Finalist for TechTarget’s 2019 Storage Product of the Year – Citybizlist Real Estate

JERSEY CITY, N.J.--(BUSINESS WIRE)--In-Memory Storage and Virtualization innovator Formulus Black today announced that it has been selected as a finalist in the Storage System and Application Software category of the 18th annual Storage magazine and SearchStorage.com Products of the Year awards for its revolutionary Forsa 3.0 software, which enables any application to run cost efficiently in memory without modification.

The 2019 Storage magazine and SearchStorage Products of the Year award recognizes winners in five categories: Backup and Disaster Recovery Hardware, Software and Services; Cloud Storage; Disk and Disk Subsystems; Hyper-converged and Composable Infrastructures; and Storage System and Application Software. All the enterprise storage products were judged based on criteria of performance, innovation, ease of integration, ease of use and manageability, functionality and value. Winners will be announced February 17.

Formulus Black earned its finalist placement with its enterprise-hardened version of Forsa, which sets the bar for enabling larger databases, I/O-hungry HPC jobs, artificial intelligence and machine learning model training to run cost efficiently in memory on commodity hardware.

Along with predictive analytics and AI, finalists in this category were clearly mindful of modern enterprise data storage needs beyond simply meeting high-performance and low-latency requirements, according to the finalist announcement. With 3.0, Formulus Black gave Forsa numerous updates to enable applications to take advantage of memory channel performance. Forsa 3.0 enables any application to run databases and other I/O intensive applications entirely from memory without modification using commodity server hardware.

Forsa software utilizes fast DRAM and Intel Optane DC persistent memory as in-memory storage, and a patented BitMarker data encoding algorithm to identify patterns in data and increase the effective storage capacity of memory. Forsa enables applications to dramatically improve data processing speed while protecting against data loss via advanced features such as high availability and BLINK backup and restore. With Forsa, large, multi-terabyte database and analytics workloads can easily persist and run in memory on commodity server hardware; benchmark tests show Forsa-provisioned persistent memory can deliver up to 2.7x more TPS at roughly a quarter of the latency of NVMe SSDs for mixed read/write database workloads.

"While we can extol the virtues of Forsa forever, having a highly respected industry establishment like TechTarget echoing our attributes makes for a much more compelling case for enterprises considering implementing Forsa into their business environment," said Jing Xie, Chief Operating Officer at Formulus Black. "Being named a finalist in the Storage magazine and SearchStorage.com Product of the Year awards is a great honor and recognition of all of the hard work by our team here at Formulus Black. We look forward to seeing how we place among our fellow finalists when the awards are given next month."

The complete list of Storage magazine and SearchStorage.com Products of the Year finalists is available at https://searchstorage.techtarget.com/feature/Enterprise-data-storage-2019-Products-of-the-Year-finalists. Additional information about how Forsa supercharges the performance of I/O-intensive applications is available at https://www.formulusblack.com.

Follow Formulus Black

Twitter: https://twitter.com/formulusblack?lang=en
LinkedIn: https://www.linkedin.com/company/formulusblack/

About Formulus Black

Formulus Black develops FORSA, a software technology that is unlocking the power of in-memory compute for all applications by enabling server memory to be easily and efficiently used as a high-performance storage media. FORSA can be used to power the most demanding application workloads and for developers seeking to minimize latency, maximize throughput, and scale without performance loss. For more information and to trial our software, please visit: https://www.formulusblack.com


In public cloud, what worked at 1PB won't work at 100PB – VentureBeat

Public cloud has served as a catalyst to nearly every successful enterprise. It brought into being a plethora of startups. For small teams with great ideas, the public cloud's cost model and convenience made it possible to build a business. Public cloud providers unlocked innovations that otherwise would have taken much longer or never would have seen the light of day.

For that, the cloud deserves much credit.

Charging startups only for the storage used, public cloud providers not only enabled their customers' growth but powered their own amazing expansion, too. Forrester projects that, even in a supposed slowed-growth phase, between 2018 and 2022 the revenue from public cloud infrastructure, platforms, and apps will have a compound annual growth rate of 21%, reaching $411 billion. It's a given these days that some applications are public-cloud exclusive; it's what they're built for, and they never migrate out. Enterprises that contributed to this remarkable growth found a way to do more with less by consolidating workloads thanks to moving to the cloud.

The company I work for, Seagate Technology, is among the many beneficiaries. Like many businesses, we used to have data scattered in silos. Operational challenges consumed too much of our staff's time. The heterogeneity was hard to scale. When we first migrated one challenging workload, our Hadoop analytics, to the cloud, we saw a 40% reduction of costs.

But public cloud hasn't turned out to be cloud nine.

While we did see benefits of consolidation in the cloud initially, later on we experienced challenges. The initial euphoria from CapEx reductions gave way to gradual changes in our monthly cloud bill. Because of how we access our data, our total monthly expenses became unpredictable.

As many thriving businesses have learned, the initial comfort public data centers offer tends to dissipate, giving way to bill shock. Once enterprises reach scale, too many CIOs have to walk into their CEO's office with hat in hand and ask for an additional 20% to cover the bills. Moves like wanting to migrate some data out from the cloud can incur penalties.

The very things that earned the public cloud its loyal following (pricing transparency, predictability of costs, scalability, and latency savings) now pose challenges.

Former cloud success stories, Dropbox and Snapchat, are examples of this.

In 2015, Dropbox decided to cut costs by migrating its users onto its own infrastructure and software. The resulting savings amounted to $75 million. Snap Inc., the parent of Snapchat, spent over $1 billion on cloud-computing servers over the last two years and has been attempting to walk back its cloud commitments in order to stop bleeding so much cash.

Meanwhile, as more and more data is generated at the edge (30% of it will soon need processing near where it's created), the edge is also pulling data away from public cloud. We're entering a multicloud and edge-core world. In this new world, the convenience of public cloud needs to be complemented with predictability in cost, latency, and growth. This is why enterprises are choosing to add private and hybrid clouds into the mix. According to IDC, by 2022, 70% of enterprises will integrate cloud management across their public and private clouds by deploying unified hybrid/multicloud management technologies, tools, and processes.

Not that any of it threatens hyperscale cloud providers. The Data Age is still emergent, bound to generate 175ZB of data by 2025. The proliferation of data guarantees those providers more success. According to Gartner, in 2019 alone, public cloud revenue is set to grow 17.5%. An IDC report predicts that public cloud spending will soar from $229 billion in 2019 to almost $500 billion by 2023.

In spite of this success, or perhaps because of it, cloud companies have the option to pass benefits on to customers at scale.

On behalf of CIOs and practitioners who are proponents of public cloud, I'd like to make a recommendation to cloud providers that will benefit everyone. Why not use this position of scale to help the very customers that the cloud helped mature? Think about it: Data will only grow. It will create more business. There's enough value, created and shared, to go around. Why not invest in lifelong customers of choice by offering a predictable experience? Instead of pricing tiers and fences that limit the amount of data stored and activated, cloud providers can break open the market with simple, flat, and predictable pricing with only one pricing meter: capacity.

Storing hundreds of exabytes of data will unlock new use cases and analytics that help create higher-order revenue streams, which would not exist if the workloads repatriated away from public cloud.

Ravi Naik is Seagate Technology's Chief Information Officer and Senior Vice President of Corporate Strategy.

[Find out about guest-posting for VentureBeat.]


11 Best Ways to Fix Dropbox Not Connecting or Syncing on Windows 10 Error – Guiding Tech

Dropbox is a popular cloud storage solution that, for the most part, works right out of the box. But like all apps and software, it throws occasional errors and issues. One common error that quite a few users face is Dropbox not connecting or syncing on Windows 10 computers.

Some common reasons are internet or Wi-Fi issues, exhausted bandwidth, a corrupted file, or a lack of storage space in Dropbox. I hope you have checked for these errors before going ahead with the guide below.

Let's begin.

If you are facing network errors, then press Windows key+I to open Settings and search for Find and fix network problems.

Select Apply repairs automatically, then click on Next and follow the on-screen instructions. If the system detects any errors, it will recommend steps to fix them.

If the file you are trying to upload or sync is open in another app, then it won't sync. You will need to close the file before it can sync properly. Try it.

Right-click on the Dropbox icon in the System Tray, click on your profile pic and select Pause syncing option here.

Wait a few moments and then repeat the same steps to click on Resume syncing.

That can help jumpstart the sync process and also solve the Dropbox not connecting problem for you.

One of the key features of cloud storage is the ability to share files and folders with others. That allows others to access, view, and edit files while on the move. If someone shared a folder or a file with you, it would be visible in your primary folder. Now, if the folder or file is no longer being shared with you and rights have been revoked, the sync will not work anymore.

Is sync working for other files and folders? Can you sync new files? Is the shared file/folder still visible? If the answer is no to all or even any of these questions, you know what the problem is.

Try uploading the same file to a different Dropbox folder and see if that helps with the sync process. If it works, the folder might be corrupt. I suggest moving everything from that folder to a new one.

A whitespace conflict is a common issue that occurs when a user names one file or folder the same as another, except for an additional space somewhere in the name. Dropbox will automatically append the name with the words Whitespace Conflict.

But that might not always happen, and you may remain oblivious to the error. I suggest you check for files and folders with duplicate names to make sure none are named identically save for the additional space or whitespace.

While Dropbox notes that leaving the file/folder with the appended name should be okay, several users have faced Dropbox not syncing error on Windows 10 because of it.

Dropbox comes with a selective sync feature where you can choose to sync only select folders, leaving the rest offline on your computer. Maybe that's why the folder or files inside that folder are not syncing?

Right-click on the Dropbox icon in the System Tray again and select Preferences under the profile pic.

Click on the Selective Sync button under the Sync tab.

You can now select the folders or sub-folders that you do want to sync in the pop-up that follows.

Once done, click on Update and then Apply to save changes. Files and folders may take some time before syncing, depending on the size of the file and your internet speed.

There is no sign out option in the Dropbox app on Windows 10. Instead, you will unlink your Dropbox account and then re-link it. Right-click on the Dropbox icon in System Tray and select Preferences again.

Click the Unlink This Dropbox button under the Account tab. Reboot your computer once, then go back and add your account again. It may take some time before everything syncs, so be patient. Check again to see whether Dropbox is still not connecting or syncing.

Press Ctrl+Shift+Esc to open the Task Manager and search for Dropbox under the Processes tab.

Right-click on all instances of Dropbox here and select End task. Reboot your computer now and then relaunch Dropbox from the Start menu or desktop shortcut you may have created to begin the process.
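If you would rather script this step than click through Task Manager, a rough Python sketch using the third-party psutil package (my own illustration; the article only describes the Task Manager route) could look like this:

```python
import psutil

# End every running Dropbox process, mirroring "End task" in Task Manager.
for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower().startswith("dropbox"):
        proc.terminate()  # ask nicely; use proc.kill() if it refuses to exit
```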

Check whichever antivirus that you are using to make sure that Dropbox has not been blocked. There should be a way to whitelist installed Windows 10 apps somewhere in there. In the case of the Windows Firewall or any other app you might be using, you need to configure it to make sure it works with Dropbox.

Uninstalling Dropbox won't delete your files, either in the cloud or on your computer, so don't worry. Search for and open Control Panel from the Start menu.

Search for and select Uninstall a program in the search bar.

Find Dropbox here and right-click to select the Uninstall option here.

Download the latest stable version of Dropbox from the link below and install it. You will need to sign in and set it up before sync begins again.

If Dropbox is not connecting or syncing on Windows 10 still, the support staff has shared a detailed advanced reinstall process. It is aimed at a clean install so that old files and other cache data wont result in the same errors as before.

Download Dropbox

Dropbox is an awesome cloud storage solution, but it is not the only one out there. There are others like it that you can try. If Dropbox is giving you trouble, and you can't resolve it, I would recommend trying Google Drive, OneDrive, and Box cloud storage platforms.

Next up: Click on the link below to learn which is a better solution between Dropbox and Google Photos to store your precious memories.

Last updated on 23 Jan, 2020


What is AWS S3 and 5Ws for using it? – WhaTech

What is an AWS S3 bucket, and why use it?

AWS S3 is an object-based, serverless storage service from Amazon Web Services that is much faster than hard-drive file systems and block storage approaches for saving data. Serverless means the storage is hosted in the cloud and you don't have to configure a server with a fixed storage limit; capacity expands dynamically with usage.

What is an AWS S3 bucket?

An AWS S3 bucket is a public cloud storage unit on S3 (Simple Storage Service). A user account can hold multiple S3 buckets for storing folders and data in the form of objects, but bucket names must be unique across all AWS accounts, just like a domain name.

S3 bucket names should also be DNS-compliant, which means they shouldn't include special characters.

Why use AWS S3? Top 10 Features of AWS S3

Here we will discuss the Top 10 features of AWS S3.

1. Security

A. Security on Server Side

For server-side security, server-side encryption is used, which offers the following three options:

SSE-S3: S3 uses the AES-256 encryption algorithm to secure the data and handles the keys itself.

SSE-KMS: S3 uses the AES-256 encryption algorithm to secure the data and uses the AWS Key Management Service for envelope encryption of the keys, which gives you more control over how the keys are managed.

SSE-C: S3 uses the AES-256 encryption algorithm to secure the data, and the customer provides the keys (you manage the keys).
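As an illustration of how these options are selected in practice, here is a minimal sketch using boto3, the AWS SDK for Python; the bucket name, object keys and KMS key alias are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# SSE-S3: S3 manages the keys itself.
s3.put_object(Bucket="my-example-bucket", Key="report.csv",
              Body=b"...", ServerSideEncryption="AES256")

# SSE-KMS: keys are managed through AWS KMS.
s3.put_object(Bucket="my-example-bucket", Key="report-kms.csv",
              Body=b"...", ServerSideEncryption="aws:kms",
              SSEKMSKeyId="alias/my-example-key")
```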

B. Security in transit

For data in transit, SSL/TLS encryption is used on all HTTPS requests by default.

C. Security on Client Side

The data is first encrypted on the client side and then uploaded to AWS S3.

2. Lifecycle management

Lifecycle management is a service for automatically managing data objects once they have lived out a predetermined life cycle. The set of rules written in a lifecycle configuration can automatically delete the targeted data, or move it to a different storage class, after a set time period.
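For example, a rule that moves objects under a prefix to Glacier after 30 days and deletes them after a year could be written with boto3 roughly like this (bucket name and prefix are placeholders):

```python
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-then-expire-logs",
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            # Move to Glacier after 30 days, delete after a year.
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }]
    },
)
```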

3. Versioning

Versioning is used to maintain versions of data and to keep a record of the changes users make to it. Versioning is disabled by default; the root user can enable it.

Once you have enabled versioning, it can only be suspended, not removed, which means the versions already created will not be deleted.
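Enabling versioning on a bucket is a single SDK call; a minimal boto3 sketch (the bucket name is a placeholder):

```python
import boto3

s3 = boto3.client("s3")
# Turn versioning on; to pause it later, set Status to "Suspended".
s3.put_bucket_versioning(
    Bucket="my-example-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)
```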

4. MFA

To prevent others on a development team from deleting data from an S3 bucket, you can enable MFA (multi-factor authentication) delete, but versioning must be turned on first. Enabling MFA delete allows only the root user to delete data from S3 buckets, and only after successfully supplying the MFA token.

5. ACL

An ACL (access control list) is a simple permission template, and a legacy method, for managing permissions on objects and S3 buckets.

6. Bucket Policies

Bucket policies are JSON documents that allow developers to write thorough access control rules.
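As a sketch of what such a JSON document looks like, the hypothetical policy below denies any request to the bucket that is not made over HTTPS (bucket name is a placeholder):

```python
import json
import boto3

bucket = "my-example-bucket"  # placeholder
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyInsecureTransport",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }],
}
boto3.client("s3").put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))
```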

7. Cross-Region Replication

Cross-region replication is replicating the data present in one data centre to another data centre situated at a different geographical location. The replication of data can be done across accounts as well as S3 buckets.

In case of natural calamities, the software solution will not shut down; it will start fetching data from the data centre located in a different region.

Although AWS S3 stores your data across multiple geographically distant Availability Zones by default, compliance requirements might dictate that you store data at even greater distances. Cross-Region Replication allows you to replicate data between distant AWS Regions to meet compliance requirements.

If your customers are in two geographic locations, you can minimize latency in accessing objects by maintaining object copies in AWS Regions that are geographically closer to your users.

If you have compute clusters in two different AWS Regions that analyse the same set of objects, you might choose to maintain object copies in those Regions.
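Setting up cross-region replication requires versioning on both buckets and an IAM role that S3 can assume; a rough boto3 sketch, with placeholder bucket names and ARNs:

```python
import boto3

s3 = boto3.client("s3")
# Both the source and destination buckets must already have versioning enabled.
s3.put_bucket_replication(
    Bucket="my-source-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/my-s3-replication-role",
        "Rules": [{
            "ID": "replicate-everything",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": "arn:aws:s3:::my-destination-bucket"},
        }],
    },
)
```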

8. Transfer Acceleration

AWS S3 Transfer Acceleration enables fast, easy, and secure transfers of files over long distances between your client machine and an S3 bucket. Transfer Acceleration takes advantage of AWS CloudFront's globally distributed edge locations.

As the data arrives at an edge location, it is routed to Amazon S3 over an optimized network path. When using Transfer Acceleration, additional data transfer charges may apply.

Only the S3 bucket owner can enable transfer acceleration, to leverage the maximum bandwidth of their internet connection when frequently uploading gigabytes to terabytes of data.
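Enabling acceleration and then sending uploads through the accelerate endpoint might look roughly like this in boto3 (bucket name and file names are placeholders):

```python
import boto3
from botocore.config import Config

s3 = boto3.client("s3")
s3.put_bucket_accelerate_configuration(
    Bucket="my-example-bucket",
    AccelerateConfiguration={"Status": "Enabled"},
)

# Uploads made with this client are routed via the accelerate endpoint.
s3_accel = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))
s3_accel.upload_file("big-video.mp4", "my-example-bucket", "uploads/big-video.mp4")
```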

9. Pre-signed URLs

Each object uploaded to an AWS S3 bucket gets a unique URL, and it is accessible to people according to the access-level permissions (public, private or limited access). When an AWS user wants to give someone read or write access to an object for a limited time, they can create a pre-signed URL, which is signed with their credentials and grants access for the predetermined time period.
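Generating a pre-signed URL is a single SDK call; a minimal boto3 sketch (bucket and key are placeholders):

```python
import boto3

s3 = boto3.client("s3")
# Anyone holding this URL can download the object for the next hour.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-example-bucket", "Key": "reports/q4.pdf"},
    ExpiresIn=3600,  # seconds
)
print(url)
```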

10. Storage Classes

AWS S3 has the following six storage classes, where price decreases as availability and access speed decrease.

The Standard storage class is the fastest and most expensive, as the data in it is replicated across at least three Availability Zones. This class is best for data that is accessed frequently, because latency is a matter of milliseconds.

Standard-IA (Infrequent Access) matches the Standard storage class in performance, but fewer services are bundled with it, hence it is cheaper.

In One Zone-IA, objects are stored in only one Availability Zone to reduce the price, so resilience is lower than in the Standard storage class. Data objects that are used less frequently, say once a month, should be stored in this storage class.

Data that's older than a month and hardly accessed by anyone should be moved to Glacier to reduce the storage cost to a fraction.

Glacier Deep Archive is used to store data that needs to be kept for a year or more. Usually, this type of data is enterprise operations data or data that must be retained for legal compliance.

Glacier Deep Archive is the cheapest of all its peer storage classes, and the data retrieval time is measured in hours.

Intelligent-Tiering analyses how often each object is accessed and automatically places it in the most cost-efficient storage tier; the least-accessed objects are moved down to the cheaper tiers.

For more insights, you can refer to the performance chart by AWS.
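The storage class is chosen per object at upload time (or changed later by a lifecycle rule); a minimal boto3 sketch with placeholder names:

```python
import boto3

s3 = boto3.client("s3")
# Store an infrequently accessed backup directly in Standard-IA.
s3.upload_file(
    "backup.tar.gz",
    "my-example-bucket",
    "backups/2020-01/backup.tar.gz",
    ExtraArgs={"StorageClass": "STANDARD_IA"},
)
```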

Who should use AWS S3?

The solutions architect incorporates AWS S3 into the solution architecture and, on deployment, directs the DevOps team to use it for storing the data.

When to use AWS S3?

When your project has a large amount of data which is increasing at an unpredictable rate.

Where to use AWS S3?

A project where large amounts of sensitive data are being generated and accessed should use AWS S3 to reliably manage access to the data and protect it. Usually these projects are enterprise-scale and cannot tolerate downtime.




The Future of Edge Computing: Beyond IoT – Datamation

Register for this live video webcast - Thursday, January 30, 10:00 AM PT. Ask the expert - get your questions about vendor relationships answered by industry leaders.

Edge computing offers enormous promise; some say it may even supplant cloud computing. Certainly this emerging technology, in which sensors across the web provide a torrent of data, is growing rapidly. A research report in August 2019 forecast a blistering 32% compound annual growth rate between now and 2023, meaning the edge market will double in size.

Edge computing fuels many of the tech trends that are getting buzz today, including smart factories, smart grids, connected vehicles and more. While IoT has driven edge computing, the technology, fueled by 5G, will play an ever greater role in many sectors beyond IoT.

To provide insight into the future of this key technology, I'll speak with a leading expert, Bryan Beal, Senior Director of Strategy and Solution Innovation in VMware's Telco and Edge Cloud group.

Bryan Beal, Senior Director, Strategy and Solution Innovation at VMware

James Maguire, Managing Editor, Datamation (moderator)

Bring your questions to this live video webcast; we'll answer as many as we can.

Register for this live video webcast - Thursday, January 30, 10:00 AM PT

In this webcast you will learn:

Register for this live video webcast - Thursday, January 30, 10:00 AM PT

Get your data questions answered by leading experts.

