Category Archives: Cloud Storage

Differing organizational views can complicate the cloud journey – Tech Wire Asia

The cloud journey continues to present new challenges as businesses push to get the most out of it. What started as technology to enhance storage capabilities has now enabled organizations to perform beyond their expectations. Be it a cloud-native business or a 100-year-old organization that has moved operations online, the cloud has delivered what it was designed to do.

However, as organizations' needs grow, especially in meeting customer demands, it's all about speed today. For most organizations, the cloud has to process workloads that improve productivity, enhance business offerings, and enable recovery from disruption. Even businesses that took a cautious approach to their cloud journey found themselves moving faster to the cloud, especially during the COVID-19 pandemic.

Today, any organization that wants to remain relevant in the digital world would be making the most of the data available to them. And the best way of doing that is by leveraging the many capabilities offered by the cloud.

But here's where it can get tricky. While financial and regulated industries need to take a cautious approach to the cloud, most organizations have been investing in the cloud without a proper strategy in place. The public cloud, which most organizations rely on, is provided by big tech companies.

The public cloud was the most economical approach to digital transformation, but there were limitations when it came to meeting regulatory requirements for some industries. The private cloud enabled these industries to make the most of cloud capabilities on-premises. Yet even this wasn't enough for organizations.

This is where the multi-cloud comes in. By using a variety of cloud services from different providers, businesses can spread out their workloads and data and worry less about regulatory and cybersecurity issues.

Today, apart from providing their own cloud capabilities, cloud vendors also insist that they have no issues with organizations using more than one cloud provider for their workloads, app development, storage, backup, and so on.

However, according to the 2023 Cloud Complexity Report by NetApp, 98% of senior IT leaders have been impacted by increasing cloud complexity in some capacity, potentially leading to poor IT performance, loss in revenue and barriers to business growth. The findings are based on a global survey by NetApp on how technology decision-makers are navigating cloud requirements coming from digital transformation and AI initiatives and the complexity of multi-cloud environments.

Ronen Schwartz, Senior Vice President and General Manager, Cloud Storage at NetApp, commented: "As cloud adoption accelerates and businesses innovate faster to compete, technology leaders are facing growing pressure to juggle multiple priorities at once, causing many to rethink how they manage efficiency and security in this new environment."

"Our global survey data demonstrates the extreme complexity of modern IT environments, and the pressure technology executives are under to show measurable outcomes from cloud investments. At NetApp, we've simplified the complex through our approach, which enables technology executives to increase the speed of innovation, lower costs and improve consistency, flexibility and agility across on-premises and cloud environments," said Gabie Boko, Chief Marketing Officer, NetApp.

When it comes to data, its exponential growth has led to complexity for organizations globally. The survey showed that tech executives are feeling the pressure to contain its impact on the business. In Asia Pacific (APAC), the top business impacts due to the increasing complexity of data across their cloud environments are increased skepticism over the cloud from leadership (47%), staff not taking full advantage of business applications (47%), increased cybersecurity risk (45%), and lack of visibility into business operations (41%).

Looking at cloud strategy, ESG has become an unexpected cloud driver. Nearly half of tech executives (49%) also stated that when cloud strategy discussions happen, cost concerns come up often or all the time. Data regulation and compliance is another cloud driver, with various local regulations shaping multi-cloud strategies most or some of the time.

In APAC, 86% of tech executives are already expected to show results across the organization. The pressure to show ROI on cloud investment is highest in India and Singapore, where 9 out of 10 tech executives feel it. Furthermore, 80% of executives in APAC say cloud systems are developed with sustainability goals specifically in mind. Within the region, Singapore (72%) and Japan (69%) lead in featuring the cloud prominently in their sustainability strategy. Three out of four (75%) APAC tech executives also say their multi-cloud strategy is driven by data sovereignty requirements.

AI-driven applications are adding more complexity to the cloud journey as well. Over a third (37%) of tech executives report that half or more of their cloud deployments will be supported by AI-driven applications in the next year. Nearly half of the tech executives at smaller companies (those with fewer than 250 employees) expect to reach the 50% mark in the next year, and 63% expect to by 2030, while larger companies lag.

Globally, the report shows that the U.S. leads EMEA and APAC on plans to deploy AI-driven cloud applications in the next year, with France and Japan as outliers in their regions. In APAC, 56% of tech executives report that half or more of their cloud deployments will be supported by AI-driven applications by 2030. This presents a long-term growth opportunity for AI-driven applications in the region.

"APAC leaders today recognize the cloud's importance in producing critical business outcomes such as data sovereignty and sustainability. By addressing the cloud complexity confronting their organizations, they can unlock the best of the cloud and innovate faster to compete," commented Matthew Swinbourne, CTO of Cloud Architecture at NetApp Asia Pacific.

As organizations increasingly move to multi-cloud environments, NetApp aims to alleviate efficiency bottlenecks by allowing IT leaders to manage their systems on one streamlined user interface. By taking an evolved cloud approach, NetApp is leading the charge for next-generation cloud management and storage, equipping teams with the tools necessary to stay abreast of the key trends outlined in the research report above (e.g., optimizing costs, assessing risks, and operating sustainably).

Aaron Raj

Aaron enjoys writing about enterprise technology in the region. He has attended and covered many local and international tech expos, events and forums, speaking to some of the biggest tech personalities in the industry. With over a decade of experience in the media, Aaron previously worked on politics, business, sports and entertainment news.

A Tech Expert Tells Us What To Do Immediately After Getting The Low Storage Notification On Your iPhone – SheFinds

April 3, 2023 by Lisa Cupido

Getting a low storage notification on your iPhone can send you into a frenzy as you try to figure out what to do first to free up space and stop your phone from losing even more storage. One of the best things you can do is act fast, and there are a number of helpful tips that are super simple to follow and will make a big difference in your storage capacity in no time. Tech expert Austin Farrington, vice president of content strategy and partnerships at Reloft, tells us what to do immediately after getting the low storage notification on your iPhone. Additionally, here's how you can prevent losing more storage in the future.

The best thing you can do is prevent your phone from losing storage in the first place. Here are ways you can prevent low storage issues in the future:

Following these steps can help ensure you avoid low storage notifications while also dealing with them swiftly and effectively when you get that pop-up.

Protecting data in the cloud: Who’s responsible? – Gulf Business

Is cloud storage more secure than on-premises?

While cloud storage is becoming an increasingly popular option for businesses today, it's not necessarily more or less secure than on-premises storage. Rather, when it comes to cloud security versus on-premises, the answer is highly dependent on how the data is being managed, encrypted and safeguarded.

So the most significant difference when it comes to cloud storage is for a business to ensure it has full clarity on precisely what its cloud provider is protecting for it, and then to go a step further and ensure encryption and security measures are in place as a safeguard in case that first layer of protection fails.

Contrary to popular belief, the cloud provider will not always provide the business with all the necessary tools to combat cyberattacks. It is up to the company to read the fine print to ensure they know what they're getting from their cloud service provider (CSP).

As part of their usual agreements, CSPs generally only offer guarantees for their provided services, but not always for the protection of the customer's data while it sits on their servers. And it's not just lost data that employees mistakenly lay at the door of providers.

The assumption that cloud providers protect cloud data from ransomware attacks is potentially even more harmful. This is fundamentally incorrect and will continue to put businesses at risk.

The obvious solution is for a business to understand its CSP agreements and put measures in place that will help to restore mission-critical data if it's lost or damaged.

Multi-cloud environments are becoming increasingly complex to manage and businesses today have the added responsibility of safeguarding their own data within them, regardless of the CSP agreements in place.

The best way to ensure widespread protection of data is by asking for a little help from experts in the field. That's why we highly recommend businesses work with third parties who have the tools and expertise to automate security measures and relieve IT departments of the heavy lifting.

In addition to deploying tools to help manage and automate protection, there are also a number of ways businesses can help maintain their data protection.

For example, having employees use a strong password or a passwordless solution is an important way to raise the standard of protection within the business. Don't be predictable when creating passwords: the easier your password is to guess or crack with brute force, the faster a hacker can access your accounts.

This goes for encryption as well, which essentially sets a password for all of your most valuable files, so that even if hackers get into your database, they won't be able to access individual datasets.
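
To make the idea concrete, here is a minimal sketch of file encryption in Python using the widely available cryptography library. The file name is just a placeholder, and in practice the key must be stored somewhere safe, separate from the encrypted data.

```python
from cryptography.fernet import Fernet

# Generate a key once and keep it apart from the data it protects.
key = Fernet.generate_key()
f = Fernet(key)

# "customers.csv" is a placeholder file name.
with open("customers.csv", "rb") as src:
    token = f.encrypt(src.read())   # authenticated symmetric encryption

with open("customers.csv.enc", "wb") as dst:
    dst.write(token)

# Later, only someone holding the key can recover the original bytes:
# original = Fernet(key).decrypt(token)
```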

Finally, two-factor verification is another way to enhance your protection posture, as is setting up user permissions that grant access to specific data based on your organisational hierarchy. This ensures that staff only have access to the information they need to perform their jobs.
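
As an illustration of how time-based two-factor codes work under the hood, here is a small sketch using the pyotp library; the secret below would normally be provisioned once per user (for example via a QR code at enrollment) rather than generated at login.

```python
import pyotp

# Provisioned once per user and stored by both the user's app and the server.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

print("Current one-time code:", totp.now())

# At login, the code the user types is checked alongside their password.
user_code = input("Enter the 6-digit code: ")
print("Accepted" if totp.verify(user_code) else "Rejected")
```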

IDrive e2 is Offering Hot S3 Compatible Object Storage with … – PR Newswire

LOS ANGELES, March 31, 2023 /PRNewswire/ -- IDrive e2, a high-performance and scalable S3-compatible object storage service, has improved performance for Rclone, allowing users to easily back up their data to IDrive e2 using Rclone to save on costs while maintaining optimal performance.

Moving data to a cloud storage service can be expensive, which is why leveraging Rclone with IDrive e2 is a great way to make this process more affordable, while also allowing users to back up large volumes of data at a much lower price point than the alternatives. Users can also easily automate their backups so their data is always up to date, without the need for manual intervention.

With the increasing demand for IDrive e2, the company recognizes the importance of giving all types of users the best possible performance, and has enhanced the service to deliver the fastest backups possible. By using IDrive e2 with Rclone, users can now benefit from faster backup speeds on the e2 service.

Using Rclone to back up data to IDrive e2 is simple to set up, with detailed step-by-step instructions available on the IDrive e2 website.
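
Rclone handles the transfer from the command line, but because IDrive e2 is S3-compatible, the same kind of access can be sketched in Python with boto3. The endpoint, bucket name and credentials below are placeholders rather than real IDrive e2 values.

```python
import boto3

# Placeholders: substitute your own endpoint, bucket and credentials.
s3 = boto3.client(
    "s3",
    endpoint_url="https://your-e2-region-endpoint.example.com",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

# Upload a local archive; any S3-compatible service accepts this call.
s3.upload_file("backup.tar.gz", "my-backup-bucket", "backups/backup.tar.gz")

# List what is stored so far.
for obj in s3.list_objects_v2(Bucket="my-backup-bucket").get("Contents", []):
    print(obj["Key"], obj["Size"])
```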

In addition to enhanced performance, IDrive e2 also offers a range of other advanced features, including support for multiple cloud storage platforms, file versioning for ransomware protection, and flexible scheduling options.

With edge locations in Montreal, Ireland, London, Frankfurt, Madrid and Paris, along with 8 locations across the United States, Rclone users can point their data to the edge location closest to them for faster network access, and retrieve it quickly whenever they need it. This keeps response times for S3 API calls low no matter which region the user is located in.

IDrive e2 can store petabytes of data, and users can retrieve it quickly whenever they need it using their associated access key ID and secret access key. Enterprise users can access data directly from the IDrive e2 web console or via third-party tools such as MSP360 or Duplicati.

When it comes to affordability, IDrive e2 positions itself as the clear winner among S3-compatible storage solutions, offering annual plans that start as low as $40/year with zero egress fees. Monthly plans are also affordable, starting at $0.004/GB/month, and IDrive e2 regularly offers new users 90% off their first year.
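
For a rough sense of scale, here is the arithmetic on the quoted per-GB rate alone; the stored volume is hypothetical, and actual plans, minimums and discounts may differ.

```python
rate_per_gb_month = 0.004   # the quoted monthly rate, in dollars per GB
stored_gb = 1024            # a hypothetical 1 TB of backed-up data

monthly_cost = stored_gb * rate_per_gb_month
print(f"${monthly_cost:.2f}/month, roughly ${monthly_cost * 12:.2f}/year")
# -> $4.10/month, roughly $49.15/year
```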

About IDrive

IDrive Inc. is a privately held company specializing in cloud storage, online backup, file sharing, remote access, compliance and related technologies. Core services include IDrive, IDrive e2, RemotePC and IBackup. The company's services help over 4 million customers back up over 500 petabytes of data.

SOURCE IDrive Inc.

NetApp Quantifies The Harsh Realities Of Cloud Adoption – Forbes

I'm a sucker for a good survey, especially one that goes deep into the concerns of IT practitioners. In the past month, I've written about Palo Alto Networks' 2023 State of Cloud-Native Security Report and Komprise's 2023 State of Unstructured Data Management Report. I do this because I learn from them, and hope you do too.

A well-fielded survey can reveal truths about the market that you won't find otherwise. Sometimes you'll find that it validates assumptions you already hold, but other times a good survey will shine a light on something completely new. NetApp's just-released 2023 Cloud Complexity Report does both.

IT organizations are asked to do more with less in our current challenging macroeconomic environment. As a result, demands on IT resources are increasing, all while staffing and infrastructure investment must be fought for.

In addition, after more than a decade of enterprise adoption, IT organizations are coming to terms with the inherent complexity in deploying to the public cloud. NetApp's survey reflects this new reality.

I like that NetApp includes the word complexity in its title because that truly reflects the challenges of managing IT resources across a multi-cloud infrastructure. NetApp's survey revealed that 98% of global IT executives report that their organizations have been impacted by the increasing complexity of managing data across the cloud. This may seem obvious, but the survey goes a level deeper. This is where some surprises surface.

Factoring into the complexity facing IT organizations are the usual suspects of increased cybersecurity risks, budget concerns, and staffing issues. Most surprisingly, 44% of survey respondents noted increased skepticism over cloud adoption from leadership. This was coupled with 44% of respondents saying that staff isn't taking full advantage of business applications in the cloud.

I was fortunate to speak to Ronen Schwartz, NetApp's Senior Vice President and General Manager of its Cloud Storage business. I talked to Mr. Schwartz just before the report was released, and I asked him what, in the results, surprised him most.

Ronen Schwartz told me his biggest surprise is that short-term return on investment is now a top-tier concern for IT administrators. This is a shift from recent years, where the top driver revolved around agility and solving complexity. But times have changed.

According to NetApp, 84% of technology executives are already expected to show returns on cloud investment or are under pressure to deliver short-term progress. This is coupled with the 76% of respondents who indicated that their business leaders are "somewhat to extremely skeptical" of seeing cost savings.

NetApp went a layer deeper and showed that while pressure to show ROI is common (33%), it's highest where business leaders are most skeptical (44%). Adding additional color, the survey indicates that pressure over higher ROI tends to come more from director-level management, who are closer to the day-to-day operations, than from C-level executives.

While management is increasingly skeptical about cloud adoption and is demanding a faster ROI, cloud adoption continues to grow. The agility and flexibility the public cloud offers is compelling, especially compared to CapEx-driven on-prem infrastructure. When looking at why organizations deploy to the cloud in 2023, NetApp's survey yielded several expected answers but also one that surprised me.

Chart: Key Drivers for Cloud Investment (Source: NetApp)

Scaling artificial intelligence and automation is the top-ranked business need most likely to drive cloud investment decisions in 2023. This sits above well-known drivers such as regulatory compliance, data security, ESG, budget, and operational complexity. Nearly half of technology executives surveyed in the United States report that "half or more" of their cloud deployments will be supported by AI-driven applications over the coming year.

AI is seen as critical to delivering multiple business benefits. According to this survey, the top benefits are using AI to achieve greater security and risk assessment, improved customer experience, and increased production rate. This survey was fielded before the current excitement around large language models, such as ChatGPT, but my guess is that LLMs deliver these same benefits.

It wasn't that long ago that the public cloud was seen by IT leadership as a path for reinvention. It promised a flexible consumption model, paid for with easy-to-budget OpEx. Everything was going to move to the cloud, all of us technology analysts promised.

NetApp's survey shows that corporate leadership is recognizing that the reality of the public cloud is far messier. This is good news for IT, as it indicates that conversations about cloud adoption are now grounded in reality.

It's also good news for companies like NetApp, whose offerings remove some of the complexity inherent in public cloud deployments. NetApp's cloud business brings NetApp's data and storage management solutions to the top public cloud providers. So whether we're talking about enterprise cloud storage with NetApp's ONTAP cloud volumes, NetApp BlueXP for managing the hybrid-cloud experience with a unified control plane, or its multiple data protection and cyber-resilience technologies, NetApp has a strong play in the public cloud.

The market is responding. In its latest earnings, NetApp reported that its public cloud ARR is now $605M, up 29% from the prior year. Actual revenue from the public cloud for the quarter was $150M, up from $110M. The company's cloud partnerships only continue to grow.

The multi-cloud environment is only becoming more complex. Consumption-based on-prem offerings such as Hewlett Packard Enterprise's GreenLake and Dell Technologies' APEX are increasingly challenging the public cloud. In addition, NetApp has its own storage-as-a-service, NetApp Keystone. These solutions co-exist with the public cloud, a key element in nearly every enterprise's infrastructure.

Understanding the complexity of multi-cloud is paramount to solving the problems of working IT practitioners. NetApp's survey demonstrates that the company is working to understand these needs better. This will only make NetApp's solutions better.

Disclosure: Steve McDowell is an industry analyst, and NAND Research an industry analyst firm, that engages in, or has engaged in, research, analysis, and advisory services with many technology companies, which may include those mentioned in this article. Mr. McDowell does not hold any equity positions with any company mentioned in this article.

Steve McDowell is principal analyst and founding partner at NAND Research. Steve is a technologist with over 25 years of deep industry experience in a variety of strategy, engineering, and strategic marketing roles, all with the unifying theme of delivering innovative technologies into the enterprise infrastructure market.

World Backup Day: Peace of mind is a backup away – The Economic Times

Have you ever lost a precious photo or document due to a computer crash or accidental deletion? It's a frustrating and sometimes devastating experience, especially in today's digital age, where our lives are intertwined with technology. That's why World Backup Day, which is observed on March 31st, is such an important reminder to safeguard our data and celebrate the peace of mind that comes with secure backups.

When it comes to backing up your data, there are several options to choose from, each with its own benefits and considerations. Let's dive into the top ways to safely back up your precious data and explore which solution might be the best fit for your needs.

Cloud storage

Cloud storage is an excellent option for those who prioritise convenience and accessibility. With cloud storage services like Google Drive, Dropbox, and OneDrive, you can access your data from any device with an internet connection, making it a perfect solution for those who work on the go. Cloud storage is also secure, with many providers offering encryption and two-factor authentication for added protection.

Hybrid backup solutions

Hybrid backup solutions offer the benefits of both cloud storage and external hard drives, providing an extra layer of protection against data loss. With hybrid solutions, your data is automatically backed up to both the cloud and an external hard drive, providing redundancy and added security. Hybrid solutions are also scalable, allowing you to adjust your storage needs as your data grows.

In conclusion, there are several ways to back up your precious data, each with its own benefits and considerations. By choosing a solution that fits your needs and implementing it consistently, you can celebrate World Backup Day with peace of mind and the knowledge that your data is safe and secure. Don't wait until it's too late: take action and back up your data today!

IPFS Review: How Is Decentralized Data Sharing Better than the … – Cryptopolitan

In recent years, the rise of blockchain technology has brought a lot of new ideas to the forefront of the tech industry. One such innovation is the InterPlanetary File System (IPFS), a distributed file system that aims to provide access to the internet through decentralized data sharing. With the increasing need for decentralized and secure data sharing, many experts are now touting IPFS as a potential alternative to the traditional Hypertext Transfer Protocol (HTTP). In this article, we'll explore what IPFS is, how it works, and whether it has the potential to replace HTTP as the dominant protocol for content sharing on the web.

HTTP, or Hypertext Transfer Protocol, is the primary protocol used for transferring data over Web2. It is a request-response protocol, which means that a client (such as a web browser) sends a request to a server, and the server responds with a message containing the requested data. The data exchanged over HTTP typically consists of text, images, videos, and other types of media.

When a client makes a request to a server, the request is typically made using a URL, or Uniform Resource Locator, which specifies the location of the resource that the client is requesting. The server responds to the request by sending back a message containing the requested resource, usually an HTML document that the browser can render to display the content to the user.
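
As a minimal illustration of that request-response exchange, Python's standard library can issue an HTTP GET and read back the status, headers and body; the URL is simply a placeholder for any reachable page.

```python
from urllib.request import urlopen

# Placeholder URL; any reachable web page behaves the same way.
with urlopen("https://example.com/") as response:
    print(response.status)                    # e.g. 200
    print(response.headers["Content-Type"])   # e.g. text/html; charset=UTF-8
    html = response.read().decode("utf-8")    # the document a browser would render
```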

HTTP is built on top of the TCP/IP protocol stack, which is a set of protocols used for communication over the Internet. It operates at the application layer, which is the highest layer in the protocol stack. This allows it to interact with other protocols in the lower layers, such as TCP (Transmission Control Protocol) and IP (Internet Protocol), to ensure reliable and efficient data transfer.

While HTTP has been the standard protocol for data exchange on the Web for many years, it has some limitations. One of the main limitations is that it relies on a client-server architecture, which can lead to issues with scalability and reliability. Additionally, HTTP is not well-suited for distributing large files or handling large volumes of data.

IPFS is a decentralized protocol designed to revolutionize the way data is organized and shared on the internet. It is a modular suite of protocols built with content addressing and peer-to-peer networking in mind. IPFS has multiple implementations since it is an open-source project. Its main use case is to publish data such as files, directories, and websites in a decentralized manner, and it has many potential applications in the field of distributed systems.

Launched by Protocol Labs, IPFS allows any computer worldwide to download its software and start hosting and serving files. Once a file is uploaded to the IPFS network, it can be viewed and downloaded by anyone else running IPFS. However, it is important to note that IPFS is not a storage or cloud service provider, even though it can be deployed on cloud infrastructure. Instead, it is a protocol that facilitates the storage and distribution of data in a decentralized manner.

How does IPFS work?

IPFS comprises several subsystems responsible for representing, routing, and transferring data. While these are the key responsibilities, IPFS functionality extends beyond these three. IPFS addresses data by its contents, a concept referred to as content addressing, rather than location addressing such as an IP address.

Data in IPFS is represented as content-addressed blocks, and the system operates on these data blocks using subsystems such as Content Identifier (CID), InterPlanetary Linked Data (IPLD), and Content Addressable aRchive (CAR) files. These subsystems are responsible for addressing and representing data on IPFS, and they ensure the efficient routing and transferring of data between nodes. The CID subsystem provides unique identifiers for each block of data, while IPLD ensures the interoperability of data across different systems, and the CAR files subsystem is responsible for creating portable archives of data for easy transfer.
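
To illustrate the underlying idea of content addressing, the sketch below derives an identifier from the bytes themselves using a plain SHA-256 digest. Real IPFS CIDs wrap such a hash in multihash and multibase encodings, so this is a conceptual illustration rather than a CID implementation.

```python
import hashlib

def content_address(data: bytes) -> str:
    # The "name" of the data is derived from the data itself.
    return hashlib.sha256(data).hexdigest()

print(content_address(b"hello ipfs"))    # identical bytes yield the same address on any node
print(content_address(b"hello ipfs!"))   # changing one byte yields a completely different address
```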

Protocol Labs has developed two complementary protocols, IPFS and Filecoin. IPFS is designed to allow peers to store, transfer and request verifiable data, while Filecoin provides a persistent data storage system with an incentive layer. Users pay to store their files on storage providers' systems, and providers are rewarded for continuously storing data and cryptographically proving it. These two protocols can be used separately or together and may have applications in blockchain development, crypto mining, and NFT ownership, among other areas.

Filecoin uses its own cryptocurrency and digital payment system to ensure that files are stored reliably over time. Users pay for storage and storage providers can be anyone who is willing to store files and prove they have stored them correctly over time. The Filecoin protocol uses a blockchain and its own cryptocurrency to incentivize storage providers. IPFS, on the other hand, allows for content addressing and movement, while Filecoin provides an incentive layer for data persistence. While both protocols were developed by Protocol Labs and are complementary, they are also separable, and IPFS already supports more self-organized or altruistic forms of data persistence through tools like IPFS Cluster.

HTTP uses URLs (Uniform Resource Locators) to address content. URLs point to the location of the content on the internet, which means that if the content is moved or deleted, the URL becomes invalid. IPFS uses content-based addressing, which means that content is addressed based on its content hash, rather than its location. This makes IPFS content immutable and permanent, even if the original node that shared it goes offline.

HTTP relies on a centralized server-client architecture where content is stored on a central server and clients request data from that server. This makes HTTP vulnerable to single points of failure and censorship. IPFS, on the other hand, is a decentralized protocol that allows for peer-to-peer communication and storage. IPFS nodes share and serve content with each other, making it resistant to censorship and failure.

HTTP's caching mechanism is based on the assumption that the content requested will remain the same for a certain period of time. This assumption can lead to outdated content being served to users. On the other hand, IPFS uses a distributed hash table to store content, which means that frequently accessed content is stored in multiple locations, reducing the risk of outdated content being served.

HTTP requires the entire content to be transferred for each request, even if the content has not changed since the last request. This can result in a waste of bandwidth. IPFS, on the other hand, uses a content-addressed system, where each piece of content is assigned a unique hash. When a request is made for a piece of content, only that specific content is transferred, reducing the bandwidth required.

HTTP does not provide any inherent security measures, making it vulnerable to attacks such as man-in-the-middle attacks, where the content can be tampered with during transmission. IPFS, however, uses cryptography to secure content and each piece of content is verified using its unique hash, ensuring authenticity. Additionally, IPFS content is stored on a decentralized network, making it more difficult for attackers to manipulate the content.

HTTP transfers data from a centralized server to a client. This can result in slow data transfer speeds, especially for large files, as the client has to wait for the entire file to be downloaded before it can be used. IPFS, on the other hand, transfers data in a distributed manner, meaning that the file can be downloaded from multiple nodes simultaneously, resulting in faster download speeds.

IPFS has several advantages over traditional HTTP and centralized systems, including its decentralized nature. With IPFS, files are not stored in one central location, but rather are distributed across a network of nodes, making it more resilient to failure and censorship. This decentralization also means that no single entity has control over the network, making it more democratic and trustworthy.

Another advantage of IPFS is faster data transfer. Since IPFS stores files as content-addressed blocks, rather than location-based addresses, it can retrieve files faster than traditional HTTP, which relies on location-based addressing. This means that users can access files more quickly, and the network can handle more traffic with less latency.

IPFS also has lower bandwidth requirements compared to traditional HTTP. This is because IPFS only needs to transfer changes to files, rather than entire files, making it more efficient in terms of data usage. Additionally, IPFS uses peer-to-peer networking, which reduces the need for centralized servers and lowers overall bandwidth requirements.

Finally, IPFS offers data permanence and redundancy, which means that files uploaded to the network will be stored and accessible indefinitely. IPFS achieves this through its decentralized storage system, which ensures that files are replicated across multiple nodes in the network, making it highly resilient to data loss. This also means that users can access their files even if one or more nodes in the network fail.

With IPFS, users can share files without relying on centralized servers. This means that there is no single point of failure, and files can be accessed and shared even if the original uploader goes offline. This use case is particularly useful for content that needs to be available even if the original source is no longer accessible.

Social media platforms built on IPFS can offer users more control over their data and privacy. By using IPFS, these platforms can store user data in a decentralized manner, reducing the risk of data breaches and providing greater transparency around how user data is being used.

IPFS can be used as a building block for decentralized applications (dApps). dApps built on IPFS can benefit from its decentralized architecture, as well as its content-addressed system that makes it easier to reference data in a decentralized environment.

IPFS can be used to build decentralized websites that are not reliant on traditional web servers. With IPFS, website data can be distributed across a network of peers, making it more resilient to censorship and DDoS attacks.

One of the biggest challenges for IPFS is adoption and awareness. Despite its potential benefits, many people are not yet familiar with IPFS or have not yet seen a strong use case for it. This lack of awareness and adoption can slow down the development of the IPFS ecosystem.

IPFS is a relatively new technology, and there is still a lack of standardization in the space. This means that there are different implementations of the protocol, which can lead to interoperability issues and confusion for developers and users.

IPFS is a complex system that relies on a number of different components, and there are still technical challenges to overcome. For example, IPFS requires a lot of storage space, which can be expensive, and there are still issues with file transfers and data replication.

Another challenge for IPFS is compatibility with existing infrastructure. Many existing systems and applications are not designed to work with IPFS, which can make it difficult to integrate IPFS into existing workflows and processes.

Brave is a privacy-focused web browser that uses IPFS to enable users to access decentralized versions of websites. The Brave browser includes a built-in IPFS gateway, which allows users to access content hosted on the IPFS network without leaving the browser. This provides users with greater privacy and security, as their browsing data is not stored on centralized servers.

Filecoin is a decentralized storage network that uses IPFS to organize and retrieve data. Filecoin incentivizes users to store and share data by providing rewards in the form of its native cryptocurrency, FIL. The project has gained traction in the world of decentralized finance (DeFi), with various protocols integrating Filecoin storage solutions to enhance their data integrity.

Arweave is a decentralized storage platform that uses IPFS to provide permanent, tamper-proof storage for data and applications. Arweave is designed to provide a long-term, low-cost alternative to traditional cloud storage platforms.

Pinata is a decentralized cloud storage platform that utilizes IPFS to store and distribute files. With Pinata, users can store their files on the IPFS network, ensuring that their data is permanently available and can be accessed from anywhere in the world.

DTube is a decentralized video platform that uses IPFS to store and distribute content. Unlike traditional video platforms, DTube does not rely on centralized servers to host and distribute videos. Instead, all content is stored and shared on the IPFS network. This provides users with greater privacy and security, as their videos are not subject to censorship or removal by centralized authorities.

IPFS has the potential to revolutionize the way we store, share, and access information on the internet. Its decentralized nature, faster data transfer, lower bandwidth requirements, and data permanence and redundancy make it a promising alternative to HTTP. Real-world examples such as Brave Browser, Filecoin, Arweave, Pinata and DTube demonstrate the practical applications of IPFS in various industries.

As we move towards a more decentralized and Web3 future, the adoption of IPFS is crucial. Although there are challenges such as adoption and awareness, lack of standardization, technical challenges, and compatibility with existing infrastructure, the potential benefits of IPFS outweigh the challenges.

It is essential for developers and businesses to explore the potential of IPFS and its possibilities for their projects. As we continue to innovate and push the boundaries of what is possible, IPFS presents an exciting opportunity to build a more open, secure, and decentralized internet for everyone.

How to back up your photos while traveling – TechRadar

Holiday shots or location shoot, your treasured travel snaps are worth protecting. Anything can happen when you're roaming, which is why it pays to take precautions. After all, travel insurance might cover your camera and gear, but it can't replace the memories stored on a missing SD card.

We've outlined a range of backup options in our guide below, with several solutions to suit different types of travel photographer. Which approach suits you best will depend on how you're traveling and what's in your kit bag: if you've packed a laptop, for example, you might find it easiest to copy files from your camera and upload them to one of the best photo cloud storage services. If not, multiple SD cards might work better.

Whichever method or methods you choose, keeping your photos backed up while traveling requires some effort. To ensure the security of your snaps, you'll need to regularly back up the images kept on your camera. This is especially true if you're on a longer trip, where you can quickly amass hundreds of photos on your memory card. But it's worth every second for the peace of mind that comes from knowing your precious pics are protected.

One of the easiest ways to back up your photos while traveling is to use multiple memory cards. While many of the best SD cards offer high storage capacities at affordable prices, it's often sensible to travel with several smaller memory cards. By swapping these cards regularly, you can limit how many precious images are kept in your camera at any one time. So, should the worst happen and your camera somehow get lost or stolen, you won't lose your entire library of travel photos.

To protect your backup cards, it's best to invest in a foam-lined carry case. You'll find plenty of options online, ranging from soft-shell wallets to rugged, hard-shell cases. For extra security, try to store this separately from your camera bag.

Many of the best travel cameras have multiple memory card slots. If your camera has this option, it's a good idea to configure one slot to act as a backup for the other. While it's rare for memory cards to fail, this will give you the added reassurance that all of your shots are being saved in duplicate. It also means you can recover accidentally deleted snaps from the backup card if you need to.

If you're traveling with your laptop, the easiest way to back up your photos is to make a copy of your image files on its hard drive. Best practice when it comes to backing up is to keep multiple copies, so while you're still on the road, don't delete the originals from your SD card(s) unless you have to.

If your laptop has a built-in memory card reader, backing up is as easy as inserting the card and copying the files over to your laptop's hard drive. If it doesn't, you can easily pick up an affordable card reader online. There are simple options for SD cards alone, as well as plug-in readers for a range of card types, including CFexpress. You'll also find options with USB-A and USB-C connectivity, depending on your device.
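
If you prefer to script that copy step rather than drag folders by hand, a short sketch like the one below works on any laptop with Python installed; the card and backup paths are placeholders for wherever your devices actually mount.

```python
import shutil
from pathlib import Path

# Placeholder paths: adjust to wherever your card and backup folder mount.
CARD = Path("/Volumes/SD_CARD/DCIM")
BACKUP = Path("/Volumes/BackupDrive/travel-photos")

BACKUP.mkdir(parents=True, exist_ok=True)

for photo in CARD.rglob("*"):
    if photo.is_file():
        target = BACKUP / photo.name
        if not target.exists():           # skip shots that were already copied
            shutil.copy2(photo, target)   # copy2 preserves timestamps
            print(f"copied {photo.name}")
```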

If you're not carrying a laptop but do have a tablet in your backpack, you might be able to use that to back up your photos while traveling. You can find SD card adapters that work with most of the best tablets, including Lightning and USB-C card readers. Connect these to your tablet and you should be able to download photos to your device's built-in storage.

If your laptop doesn't have a built-in card reader or the right peripheral, you'll likely be able to link it to your camera using a USB cable and proprietary software. But if you don't have the correct cable, or simply prefer to work wirelessly, you might be able to back up photos using your camera's wireless connectivity instead. Most modern camera models, including many of those in our round-up of the best cameras, feature built-in Wi-Fi, which you can use to connect directly to your laptop, tablet or smartphone.

The process of transferring photos wirelessly varies by make and model. You'll often need to download a specific app or software in order to access your camera's storage and copy files across from it. Depending on your camera, the connection process can also require a deep dive into its settings menus to enable wireless connectivity and pair it with your device.

While the initial setup can involve several steps, once you've established the connection, it should be easier to repeat the process later in your travels. You can then use this wireless pairing to periodically back up your travel snaps by downloading copies of your photos to your device's built-in storage.

As above, the more places in which you store copies of your photo files while traveling, the safer they are. One of the easiest ways to create a backup of your travel snaps is to keep copies on flash drives or an external hard drive.

USB drives today offer rapid transfer speeds and a wide range of capacities, with some of the best flash drives weighing in at 1TB. That means you can stash a whole cache of travel snaps on a single, lightweight drive. The downside is that such a small drive is easy to misplace. If you're serious about going the extra mile to protect your photos, you might consider using multiple flash drives to divide or duplicate your library.

The alternative is an external hard drive. These are small enough to easily fit in a backpack, yet offer sufficient storage space to comfortably accommodate an entire trip's worth of travel snaps. If your travel plans are more adventurous, you'll also find several rugged hard drive options in our round-up of the best external hard drives, which promise to protect your files against dust, drops and water.

If you're traveling with a laptop, backing up photos to an external drive is as easy as connecting your camera (or its memory card), inserting the drive and copying the files. If you don't have a laptop, you'll need to be more imaginative. One option is to seek out a hostel or co-working space with computers you can use to execute the file transfer.

If you can't find access to a PC, all is not lost. Certain specialist external hard drives feature built-in SD card readers for backing up images directly. One example is the now-discontinued LaCie Rugged RAID Pro, which you can still find online if you look hard enough.

If you're willing to invest in an all-in-one solution to streamline your photo backups while traveling, we recommend looking at a wireless drive. More than just a place to store your travel snaps, these work as a complete hub for your files, offering many of the benefits of the options outlined above.

One of the best options is Western Digital's My Passport Wireless SSD. Available in several capacities, this shock-resistant external hard drive works as a standalone storage system, with a built-in battery that runs for up to 10 hours. A built-in SD card reader means you can copy photo files directly to the drive. If your camera uses a different type of card, you can also connect a compatible card reader to its USB 2.0 port.

Once your files are copied to the drive, you can access them using your smartphone or tablet via the drive's own Wi-Fi connection. This way, you can create additional backups of your travel photos by saving the photos to your device and, to go a step further, uploading them to the cloud (see below). Western Digital's isn't the only wireless drive, but it is one of the few with an SD card reader.

While physical photo backups are the easiest to create while traveling, a cloud backup service is objectively the most secure. Once copies of your images are uploaded to the cloud, they can't be lost or stolen on your journey. The only drawback? Uploading to the cloud requires a strong Wi-Fi or data connection, and the file sizes associated with photos mean the process can take a while.

We've compiled a comprehensive round-up of the best cloud storage for photos, with options for every type of photographer, designed to help you decide which one best suits your needs. Most of these offer apps for both desktop and mobile devices, including Google Photos, iCloud and Dropbox, which means you'll be able to access the cloud even if you leave your laptop at home.

What's important to consider when traveling is how you'll actually upload your photos. Assuming you have some kind of data connection, you still need to get the photo files from your camera to the cloud. To date, GoPro is one of the only manufacturers to offer direct camera-to-cloud backups: if you have a GoPro Subscription and one of the best GoPro cameras, it will automatically save your photos and video clips to the cloud when your camera is charging and connected to Wi-Fi.

For most other cameras, you'll need to use one of the methods outlined above to first transfer your photos to the library on your laptop, tablet or smartphone, then upload them to the cloud from there. Canon's smartphone app streamlines this process, allowing Google One members to back up photos from some of the best Canon cameras directly to Google Photos.

When it comes to saving your travel snaps, the best method for you will come down to what gear you're travelling with, the type of trip you're taking, and how securely you want to back up your photos. Each of the methods above has its merits, but the key point to remember is that they are not mutually exclusive: the most important rule when backing up travel photos is that more backups are better. Where possible, you should always create several copies of your images on different forms of storage.

While this might sound like overkill when you're on the road, the idea is that it's better to have an unnecessary failsafe than to end up losing your holiday photos because something went wrong with your only backup. For example, say you keep your camera and your backup hard drive in the same backpack while traveling. If that backpack goes missing, you've still lost all of your precious photos, even though you backed them up.

That's why it's better to have several photo storage solutions, and to keep them separate. The cloud is a fantastic option, because you don't need to carry it with you, but depending on your trip, you might not have ready access to a laptop or a Wi-Fi connection. In this case, do what you can to create multiple backups. For example, if you copy files to your laptop and to a flash drive, consider also keeping the originals on memory cards, at least until you're able to copy them to the cloud. In short, the best way to back up photos while traveling is to play it safe, not sorry.

Climate change: The push to reduce IT’s carbon footprint – InfoWorld

Humans are facing an existential crisis in climate change. We are also facing a crisis of collective action. As a species, we have every reason to slow the rise of global temperatures, but taking steps to cut carbon emissions is generally not in the short-term interest of individuals, companies, or countries. Where does that leave IT organizations?

IT systems all around the world consume ever-increasing amounts of electric power, making them a critical factor in increasing carbon emissions. Many people in the industry are acutely aware of IT's climate impact and want to see it reduced, but minimizing IT's carbon footprint will entail a cost that many small businesses and multinational corporations are reluctant to bear.

Curious about what might incentivize a shift to greener tech, I spoke to IT leaders who are pushing back on climate change. I found people working at every level of organizational leadership, from the top down to the bottom up, and pursuing a variety of strategies to reduce carbon consumption in company products and business models.

When asked what drives IT's carbon emissions, most respondents pointed to data. In particular, the rising popularity of data lakes and the data centers that store them are a huge contributor to the problem. Given the primacy of data for modern businesses, companies that want to reduce their carbon footprint will have to make hard choices.

"Companies would have to stop collecting a lot of (poor) data and storing it," says Chrissy Kidd, SEO manager at Splunk, which helps users sort through massive machine-generated data sets. "They won't do this because they're married to the idea that they are 'data driven' organizations, when most of them are not. We're also living in a data ecosystem, where everything is based on collecting and storing data, even when only an estimated 10% of that data gets 'used' beyond simple storage. Until we have less data to store, seemingly forever, IT companies will continue to emit more carbon than not," she said.

The explosion of storage and its emissions in recent years was driven not only by data's usefulness (real or perceived), but by a fundamental shift in underlying economic factors. "In older models, storage was one of the most expensive components of a system, so we were very selective in what data was stored and how," says George Burns III. A senior consultant for cloud operations at SPR, a technology modernization firm, Burns notes that today, "the opposite is true, in that storage is often the least expensive component of a system, which has led many organizations to adopt a 'store everything forever' mentality."

The most straightforward way to reduce data center emissions is to power those data centers with clean energy. This can turn out to be a quick win for companies looking to burnish their green credentials. As the cost of renewables continues to drop, it is also becoming a relatively inexpensive fix. "Customers of corporate colocation data centers are increasingly seeking more sustainable energy supplies, which thanks to recent progress they will be able to access more and more," says Chris Pennington, director of energy and sustainability at Iron Mountain. "Operators in our industry, Iron Mountain amongst them, have proven that renewables are a reliable and cost-effective energy source by activating innovative procurement solutions, and it is making clean energy more accessible to all."

A slew of companies are now trying to solve the data problem with data: that is, by using data analytics and other IT techniques to reduce the amount of stored data. For instance, Moogsoft, the developer of an AIOps incident management platform, uses machine learning algorithms to try to reduce the amount of data at rest and in motion on customer infrastructure. While this functionality has always been part of Moogsoft's pitch, company CTO Richard Whitehead says he's seen customers' motivations change in recent years.

"We're definitely seeing the shift going from 'I want to use fewer resources' to 'We need to be more environmentally conscious about the resources that we use,'" he explains. "We had a roadmap briefing with one of our very large customers in the energy business. And they said right off that they were intending to be carbon neutral within a certain number of years, which I thought was a fascinating and very high-level way to start a conversation with an IT vendor."

So what's behind that push? Whitehead says that "when people decide to do the right thing, it's because they have to because of legislation, because it makes sense because of economics, or because of brand." The economics argument is the most straightforward: achieving the same goal while using less electricity is good for the Earth and for the bottom line. Regulations are also a familiar driver, though climate regulations are tougher in some places than others. As for branding, he says, "we've definitely seen larger organizations, particularly in retail, whose brand focuses on doing the right thing and being green. For those organizations to have a high carbon footprint is not a good look."

Services are emerging to deliver the data that can help align organizations' operations with these goals. One example is TrueCarbon, which uses AI to analyze data center usage and identify patterns that help customers understand, control, and reduce cloud spending and carbon emissions. Kelly Fleming, CIO of Cirrus Nexus, the company that makes TrueCarbon, says that in their customer base, "some are looking to control spiraling costs without compromising their IT infrastructure's performance, while others want to understand and reduce their organization's carbon footprint." He doesn't see these two motivations as necessarily in conflict. "TrueCarbon seeks to marry these two objectives," he says. "By giving companies the ability to apply a cost to their carbon emissions through the platform, business decisions on how and where to deploy workloads based on cost inherently start to incorporate carbon emissions considerations."

Cloud storage provider Wasabi is looking to cater to customers who want to reduce their carbon footprint with a carbon footprint calculator. The tool estimates how much energy they will use based on the Wasabi data center they are storing in and how much data they store. "Over the past few years, Wasabi channel partners and customers have grown increasingly interested in decarbonization and sustainability as part of their broader environmental, social, and governmental objectives," says David Boland, Wasabi's VP of cloud strategy. "Customers are increasingly focused on sustainability for several reasons, including increased environment, social, and governance (ESG) reporting requirements, and internal sustainability efforts. Many are preparing for more stringent requirements or pursuing customers, investors, and employees who are attracted to ESG-centric products, services, and employers."
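
The article does not describe how the calculator works internally, but as a purely illustrative sketch, a tool of this kind might multiply stored volume by an assumed energy figure per terabyte and a regional grid carbon intensity; every number below is a made-up placeholder.

```python
def estimated_storage_emissions(stored_tb: float,
                                kwh_per_tb_month: float = 0.5,
                                grid_kg_co2_per_kwh: float = 0.4) -> float:
    """Return an illustrative estimate of kg CO2 per month for stored data."""
    energy_kwh = stored_tb * kwh_per_tb_month      # assumed energy to keep the data online
    return energy_kwh * grid_kg_co2_per_kwh        # assumed carbon intensity of the local grid

print(f"{estimated_storage_emissions(100):.1f} kg CO2/month for 100 TB (illustrative only)")
```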

Greenly is a company that aims to go beyond the data center, offering a software-as-a-service (SaaS) platform for carbon accounting and carbon management. "In practice, businesses start their carbon accounting because it's requested by clients, in the course of an RFP, or simply as a core requirement to work with a large account," says Greenly CEO Alexis Normand. "In the US or UK, for instance, being a supplier to the administration now means having a plan to reduce your emissions. Regulations are usually not the prime driver. What we have learned working with nearly 1,000 customers is that most companies start their climate journey when they see that it's essential to thrive as a business, either to attract the right kind of customers or the right kind of employees."

Stacy Smedley was once in a position not unlike customers of companies like Greenly or Wasabi: she wanted to better understand the carbon footprint of her work. She wasn't in IT (she worked at the construction giant Skanska), but she soon found herself as the protagonist in a typical tech industry story: unable to find the tech tool she wanted, she assembled a team and built one.

Smedley is now the executive director of Building Transparency, which offers a searchable, sortable, fully digital, and standardized database of global Environmental Product Declarations, or EPDs. These standardized documents are critical to understanding the amount of embodied carbon (that is, the amount of CO2 it takes to create something) in just about any product you can name, from concrete to spaghetti. Smedley believes these documents contain data that is key to reducing carbon across the lifecycles of whole industries. Just the existence of that data could be transformative.
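
To make the idea concrete, here is a small, hypothetical sketch of the comparison an EPD database enables: rank products by their declared embodied carbon and pick the lowest. The records are invented for illustration and do not come from Building Transparency's database or tools.

from dataclasses import dataclass

@dataclass
class EPD:
    product: str
    declared_unit: str       # e.g. "cubic yard of concrete"
    kg_co2e_per_unit: float  # global warming potential reported in the declaration

epds = [
    EPD("Mix A", "cubic yard of concrete", 362.0),
    EPD("Mix B", "cubic yard of concrete", 298.5),
    EPD("Mix C", "cubic yard of concrete", 410.2),
]

lowest = min(epds, key=lambda e: e.kg_co2e_per_unit)
print(f"Lowest embodied carbon: {lowest.product} "
      f"({lowest.kg_co2e_per_unit} kg CO2e per {lowest.declared_unit})")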

"First there was a credit in the LEED rating system that gave you points for just getting EPDs for your products that you were installing," she explains. "So the market here really started with manufacturers wanting to be able to say, 'Yes, I've got an EPD for my product, so you should use me in your building.' But if you're disclosing the amount of CO2 in every cubic yard of concrete that you make or square foot of carpet tile, it's innate that people are going start using that for more than just transparency, which is what I did. I was looking at it more through the lens of comparison and really prioritizing the lower carbon option."

Building Transparency's trajectory follows another pattern that should be familiar to those in IT: What began as an in-house project at Skanska was spun off as a nonprofit. Funding from Microsoft, a Skanska client, supported the nonprofit in offering free and open access to its database of EPDs. (Users who want early access to new features or local database instances can choose a paid account.)

Company leaders are using Building Transparency to meet their ESG goals or regulatory mandates. But Smedley sees this kind of open data access being used to enable bottom-up environmental decisions. "I think we have a whole tier of users that are the juniors at their architecture companies or at the places where they work who love the fact that they can go in for free and access this stuff and do what they can without asking for permission," she says. "That's big, and that's important to me."

With transparent access to carbon emissions data, people at all levels in all industries can take action to fight climate change. Greenly is one company that believes its products will help customers improve employee satisfaction, and it is not alone.

"As a small software company, we're also interested in people and incentivizing our employees," says Moogsoft's Whitehead. "And, it's pretty obvious that if you ask any one of our employees and say, you've got a choice: if you write this piece of code, the world's going to become a better place, or if you write that piece of code, somebody's going to go burn a tire somewhere. They're going to choose the first one. Everybody is happy about doing the right thing."

Here is the original post:
Climate change: The push to reduce IT's carbon footprint - InfoWorld

Best Outdoor Smart Cameras of 2023: 9 Best Security Picks For … – Home Theater Review

1. Performance

When comparing outdoor smart security cameras, it is important to consider image resolution, night vision, alerts, storage, audio, smart-home integration, and power. Most cameras stream and record 1080p or 2K video, which is less sharp than what a modern smartphone captures. Night vision should be standard, and some cameras even provide color night vision for extra detail. Alerts should arrive quickly, so the network connection needs to be reliable and free of data caps. Storage can be cloud-based, on a microSD card, or on a connected hard drive. Audio matters too, so look for a camera with a built-in mic and speaker. Smart-home integration varies; look for support for Alexa, HomeKit, Google Assistant, IFTTT, or SmartThings. Lastly, consider power needs, as battery-powered cameras must be recharged and some models require AC power.

When shopping for an outdoor smart security camera, it's worth weighing several key features. Motion detection allows you to customize sensitivity and privacy zones, while two-way talk lets you use your own "outside voice" to spook potential intruders. Full-color night vision technology makes it easier to see at night, and high-resolution video (1080p or higher) makes for smooth, reliable footage with plenty of detail. You may also want to consider video recording and cloud storage capabilities, a built-in siren, live view capability, and the ability to withstand all weather conditions.

When it comes to price considerations when buying an outdoor smart camera, there is a wide range of options available. The most expensive device listed was the Nest Cam with Floodlight, at $279.99, while the most affordable option was the Blink Outdoor at just under $100. Generally speaking, the more expensive cameras offered more features, but for those on a tighter budget, it was possible to get a decent camera for significantly less money.

When choosing an outdoor security camera, it is important to consider the features you need, as well as the price. Higher-quality cameras may cost more, but they can also provide more features and better performance. Whatever option you choose, you should also factor in the cost of any additional storage plans that may be required.

When shopping for an outdoor smart security camera, there are certain qualities to consider. Motion detection, two-way talk, night vision, high resolution, video recording, cloud storage, and a built-in siren are all features that can help enhance the security of your home. Motion detection allows you to customize settings to fit your needs, from adjusting sensitivity to mapping out privacy zones. Two-way talk is great for spooking potential burglars who think you're home, even when you're not. Night vision has been made easier with the advent of full-color night vision. High resolution, such as 1080p, is essential for getting sharp, detailed footage. Video recording, both locally and in the cloud, is also important. Finally, a built-in siren is an effective active deterrence tool against criminals.

The range of an outdoor smart security camera refers to how far the camera can see within its field of view. Generally, an outdoor camera's range is anywhere from 30 to 40 feet, depending on lighting conditions. At that distance it is possible to detect movement, but faces and license plates may become blurry. If a longer range is needed, it may be necessary to look into cameras that are connected to a monitoring service. Additionally, outdoor Wi-Fi cameras are typically capable of night vision and commonly offer 1080p or 2K video resolution, a 130-degree field of view, and six night-vision LEDs.

Night vision is a must-have when it comes to outdoor security cameras. With modern cameras, you can adjust sensitivity, map out privacy zones, and monitor your property in full color even during the night. This feature is perfect for those who travel often and need to ensure their property is safe. Night vision also acts as an effective deterrent for burglars and other unwanted visitors. Additionally, high-resolution video capability ensures clear and sharp footage so that you don't miss any key details. Night vision is an essential factor to consider when looking for an outdoor smart security camera.

Motion detection is an ideal feature for outdoor smart security cameras, allowing them to detect any activity in the monitored area. Motion sensors can detect changes in the environment, such as movement, and can trigger the camera to record video. Furthermore, many motion sensors can send a push alert to your phone or email when motion is detected, giving you an extra layer of security. Motion detection can also activate camera features such as floodlights, which can light up driveways and other parts of the property in the event of activity. These features combined make motion detection essential for modern-day outdoor security cameras.
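
As a rough illustration of the event flow described above, the sketch below shows a motion score above a sensitivity threshold triggering a recording and a push alert. The threshold, the zone handling, and the recorder and notifier objects are all stand-ins for illustration, not any particular camera's API.

import time

SENSITIVITY_THRESHOLD = 0.6  # assumed cutoff on a 0-1 motion score

def handle_motion(score, zone, recorder, notifier):
    """Record a clip and push an alert when motion clears the threshold."""
    if score < SENSITIVITY_THRESHOLD or zone == "privacy":
        return  # ignore weak motion and anything inside a masked privacy zone
    clip = recorder.record(seconds=30)  # stand-in: capture a 30-second clip
    notifier.push(f"Motion in {zone} at {time.strftime('%H:%M:%S')}", clip)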

The use of cloud storage is a major factor when choosing an outdoor smart security camera. Cloud storage is generally more secure than keeping video on the camera itself, since footage is encrypted and held off-site, away from local devices that can be stolen or damaged. With cloud storage, users can access footage from anywhere and rest assured that it is safe. It also eliminates the need for extra storage devices such as a hard drive or microSD card and provides a more reliable, efficient way of storing recordings.
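
One way to picture the "encrypted before it leaves the device" part of that pitch is the minimal sketch below, which assumes the third-party cryptography package and a stand-in uploader object; it is not any vendor's actual pipeline.

from cryptography.fernet import Fernet

def encrypt_and_upload(clip_bytes, key, uploader):
    """Encrypt a recorded clip locally, then hand only ciphertext to the cloud."""
    ciphertext = Fernet(key).encrypt(clip_bytes)
    uploader.put(ciphertext)  # stand-in for the storage provider's upload call

# key = Fernet.generate_key()  # generated once and kept somewhere safe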

Read this article:
Best Outdoor Smart Cameras of 2023: 9 Best Security Picks For ... - Home Theater Review