
It's almost impossible for Bitcoin to be supplanted by an altcoin; here's why – CryptoSlate

For over a decade, Bitcoin has been the king of the crypto industry; more accurately, it's the so-called grandfather of cryptocurrencies, the first that spawned the rest.

Over the majority of this decade, the asset's primacy hasn't been threatened. But, ever since 2017's bull run, there have been some claiming they've created a better Bitcoin, or a blockchain to beat all the rest. I won't mention any examples, but they're easy to come by; just take a look at Twitter or Reddit.

According to a prominent investor, however, it is nigh impossible for Bitcoin to be supplanted by another cryptocurrency project. And according to him, the logic behind this argument is rather simple.

Over the past few years, Bitcoin has been branded many things by its skeptics: the "Myspace of cryptocurrencies" and a "first-generation blockchain" are amongst the many names attempting to discredit the innovation of the system.

This was epitomized on Twitter when an individual commented that he struggles to get behind Bitcoin as an investment and technology because he sees it as the "Netscape Navigator of crypto", a technology that is great at first but will ultimately be superseded.

PlanB, a pseudonymous though respected institutional investor dabbling in Bitcoin, was quick to rebut this comment, writing (emphasis ours):

If you see bitcoin as a protocol (like tcp/ip, pop/imap, http etc) instead of a product or company, it will become clear that the next bitcoin is highly unlikely if not impossible. Network effects are important: you need developers, miners, exchanges, investors/liquidity etc.

Indeed, with the introduction of the Liquid Network, Lightning Network, and other solutions, innovators are seeking to build all applications and functionality Bitcoin competitors have on BTC. Bitcoin is being seen as the base layer of a digital economy: the backbone of the future system.

Furthermore, it goes without saying that Bitcoin has the biggest network of all cryptocurrencies, with the most active community and consistently growing usage. Surpassing it would require a lot of work.

Dan Morehead and Joey Krug of blockchain-centric fund Pantera Capital echoed this, writing that even in the short term, there's a high probability that Bitcoin outperforms a majority of altcoins amid the ongoing coronavirus crisis.

In "Crypto In This Crisis: Pantera Blockchain Letter, March 2020," the fund explained that BTC will probably outperform other tokens for a while, as it is among the crypto projects that are seriously entrenched and don't rely on inconsistent external funding:

It's a project that's already built, it works, it has an 11-year track record. Many newer blockchain and smart contract projects are still in development and might be stressed to raise funding to complete their development.

They further explained that there's typically a "flight to quality" or "flight to safety," where people want to put money in the mega-caps, the safest assets, the Treasuries of the industry. In the case of crypto assets, Bitcoin is a Treasury bond, as it is much more liquid than the rest.

In the same letter, Pantera predicted that BTC could surmount its $20,000 all-time high within the next 12 months, citing the monetary and fiscal trends that are transpiring.


These Five Altcoins Have Emerged As Most Preferred Alternative Crypto Assets – ZyCrypto

A recent survey conducted by cryptocurrency exchange Kraken has unearthed some interesting insights about cryptocurrencies and the respondents' views about their future. The survey findings indicated that besides Bitcoin, the respondents had a positive outlook on some major altcoins. Accordingly, Ethereum, Monero, XRP, Litecoin, and Tezos were the five most preferred altcoins, in that order.

The data gathered by the Kraken Intelligence team has some strong bearing on the outlook of altcoins as a majority of the respondents believe that altcoins are due to surge soon. This result is quite significant as the study collected views from 400 respondents that included individual traders, institutions, investors, payment firms, crypto exchanges, and miners.

The strong agreement in opinion among such a diverse set of respondents shows that the general market sentiment is positive towards altcoins. This can be interpreted as the general crypto community having great expectations that the alternative cryptocurrencies will emerge from their bear market that has lasted for close to two years now.

The study found that Ethereum was the most preferred altcoin, which is unsurprising since it is the second-largest cryptocurrency by market valuation, behind only Bitcoin. What was a little surprising is Monero being the second-favourite altcoin, ahead of XRP. Monero is ranked 14th in market cap terms, quite far behind XRP, which is ranked third.

One potential explanation for this huge difference between preference and market cap rank is Monero's strong privacy features. Monero users can send funds securely and anonymously, without traceability. On the other hand, XRP's real use case is facilitating cross-bank settlements, leaving it with little use for the ordinary person other than speculation.

This observation is backed by the research findings, which indicate that the reasons for preferring Monero were its ASIC resistance, its anonymity, and the developer team's proficiency and competence. On the other hand, the reasons for preferring XRP were its high beta to BTC, its security, and institutional adoption and interest.

Those that preferred Litecoin cited its community, market cap, early-mover advantage, and merchant support as the main reasons for backing the faster version of Bitcoin.

The reasons given for preferring Tezos, which emerged fifth, were its ease of staking, its STO play, its usefulness as an ETH hedge, and the fact that it's upgradable.

The insights provided by Kraken customers about these altcoins generally paint a positive picture and point to high expectations for this year.


The views expressed in the article are wholly those of the author and do not represent those of, nor should they be attributed to, ZyCrypto. This article is not meant to give financial advice. Please carry out your own research before investing in any of the various cryptocurrencies available.


Dash, NEM and Bitcoin Cash price: Altcoins recover post breakdown – AMBCrypto

The March 12 Bitcoin plummet was a collectively woeful period for altcoins as well. With the premier cryptocurrency holding over 60 percent of the total coin market, altcoins bled with Bitcoin, and are now seeing a resurgence.

At press time, Bitcoin Cash, NEM and Dash are vying for a breakout of their respective individual wedges, with some buying pressure acting as a foundation. However, if a move-up were to be charted, the buys would need to come in thick and fast.

Bitcoin Cash

The medium of exchange to Bitcoin's store of value, and the fifth-largest cryptocurrency in the market, traded upward from March 16 to March 20, but has since hit a snag. Bitcoin Cash, after reaching resistance at $242, bounced off and began moving down. The altcoin has since lost 11 percent of its value from that high, dropping to $213 in the interim.

Looking below, immediate support lies at $201, above which is the base of the wedge, which also bears significance. In the longer term, the $164 level should be watched with caution, as BCH dipped below and rose above it on two occasions between March 12 and March 16.

On the indicators front, the MACD, or moving average convergence divergence, has seen the MACD line move over the Signal line; if this pressure continues, it should move over 0, indicating more buying pressure.
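For readers unfamiliar with the indicator, the MACD line is the difference between a fast and a slow exponential moving average of price, and the Signal line is an EMA of the MACD line itself. A minimal Python sketch follows, using the conventional 12/26/9 periods (assumed here; the article doesn't state which settings were used):

```python
def ema(values, period):
    """Exponential moving average with smoothing factor 2 / (period + 1)."""
    k = 2 / (period + 1)
    avg = values[0]
    out = [avg]
    for v in values[1:]:
        avg = v * k + avg * (1 - k)
        out.append(avg)
    return out

def macd(closes, fast=12, slow=26, signal=9):
    """Return (macd_line, signal_line) for a series of closing prices."""
    fast_ema = ema(closes, fast)
    slow_ema = ema(closes, slow)
    macd_line = [f - s for f, s in zip(fast_ema, slow_ema)]
    signal_line = ema(macd_line, signal)  # the Signal line lags the MACD line
    return macd_line, signal_line
```

In a steady uptrend the fast EMA sits above the slow one, so the MACD line is positive and above its Signal line, which is the bullish configuration the article describes.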

NEM

NEM [XEM] also faces a similar predicament of moving within the wedge, away from its base. The wedge in question has been forming since before the March 12 drop, which shaved over a fourth of the altcoin's value.

However, the altcoin has mounted a strong recovery since March 16. This recovery and subsequent sideways trading have resulted in the formation of a short-term support level at $0.0355, above which the press-time price was trading by a mere 5.71 percent. Long-term support looms at $0.0301, formed earlier in the month.

Relative Strength Index [RSI] is healthy at 52, right in the middle of the channel, and has been rising consistently over the past two days, suggesting steady buying pressure.
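RSI compares recent gains against recent losses on a 0–100 scale, with 50 marking the midpoint between buying and selling pressure. A minimal sketch of Wilder's 14-period formulation in Python (the standard settings are assumed; the article doesn't specify them):

```python
def rsi(closes, period=14):
    """Wilder's Relative Strength Index for a series of closing prices."""
    gains, losses = [], []
    for prev, cur in zip(closes, closes[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    # Seed the averages with a simple mean of the first `period` changes
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    # Wilder smoothing for the remaining changes
    for g, l in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
    if avg_loss == 0:
        return 100.0
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)
```

A relentlessly rising series pins the RSI at 100 and a relentlessly falling one at 0, which is why a mid-channel reading like 52 is described as "healthy".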

Dash

Unlike the other two altcoins, Dash [DASH] is trading within a parallel channel, a relatively better sign than the back-and-forth movement of the rest of the market. Since the March 16 recovery, the altcoin has been trading within the range of $61.6 to $76.7; it brushed the roof only once, on March 20, and came close to breaching the floor on three occasions after [once on March 23 and twice on March 28].

Going by this parallel movement, the immediate support and resistance points are placed at either end of the channel in the short term.

Bollinger Bands suggest that the volatility of the altcoin has dropped, as the bands are now closer together. The coin's current price candlesticks are trading over the average line, indicating a bullish move.
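Bollinger Bands place an upper and lower envelope a fixed number of standard deviations around a simple moving average, so the bands converge exactly when volatility drops. A minimal sketch using the common 20-period, two-standard-deviation settings (assumed here, as the article doesn't state them):

```python
from statistics import mean, pstdev

def bollinger(closes, period=20, width=2.0):
    """Return (lower, middle, upper) bands for the latest `period` closes.

    Middle band = simple moving average; upper/lower = SMA +/- width * std dev.
    """
    window = closes[-period:]
    mid = mean(window)
    dev = pstdev(window)  # population standard deviation of the window
    return mid - width * dev, mid, mid + width * dev
```

With flat prices the three bands collapse onto one another; choppier prices push the upper and lower bands apart, which is the widening-versus-narrowing signal the article reads.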


Waves, Augur and Ethereum price: Bearish channel persists as optimism hovers – AMBCrypto

With the collective cryptocurrency market trading sideways, altcoins are moving helter-skelter. Ethereum, Waves and Augur, three diverse altcoins, have moved in different directions: one is yet to recover, another is trending upwards, while the third is trading flat.

Ethereum

The leading altcoin in the market saw increasing decoupling from Bitcoin earlier in the year, but now the push-and-pull with the leading cryptocurrency is back. Ether, posting a market cap of $14.1 billion, is trading within a narrow, increasingly downward-sloping wedge, formed as a consequence of the infamous March 12 drop, when Bitcoin lost almost half its value.

Resistance for the altcoin lies quite high at $141, while the press-time price was $129. Looking down, the altcoin finds support closer to its market price, at $125.4, which is where the bottom of the wedge lies.

Bollinger Bands for Ethereum posit an increase in volatility, as the bands are moving further apart. Given that in the last two hours the candlesticks have turned red, the average is now intersecting with the price; if it moves above the price, bearish woes lie ahead.

Augur

Augur, unlike its contemporaries, is trading in an upward channel, with the price rising since the beginning of the previous week. Owing to the March 12 collapse, Augur had lost almost a third of its value, falling to a low of $7.86. Since then, the upward channel has pushed the coin as high as $9.86, its press-time price.

Since hitting the support of $7.64, the altcoin has seen bullish pulls, allowing it to break resistance after resistance, flipping each into a support level. Two such levels can be charted at $8.24 and $9.54, respectively. Despite the fall on March 27, REP has managed to steer clear of a drop below the latter support, and is now striving to trade within the upward channel.

The MACD line for REP has moved below 0 and has dipped below the Signal line, indicating bearish pressures. With the upward channel's lower-bound trend line close to the price, the coin will face a struggle going forward.

Waves

The Waves platform cryptocurrency has seen a roller-coaster ride since the beginning of 2020, and is now right back where it started. After breaking $1 and then $1.5 in February and March, the coin went down with the Bitcoin drop, losing almost 40 percent of its value, and is now trading at $0.815.

Support lines lie at $0.789 and $0.726, while a short-term resistance line is present above at $0.962, with the price firmly in between. Since the March 12 drop, the altcoin has seen a visible, albeit weak, upward channel, which looks set to break given the dropping price since March 27.

RSI for the Waves platform cryptocurrency has been dropping since mid-February, indicating surging selling pressure, and now sits at 40.74, a marginal recovery from the 33.9 where it stood on March 14.


Cosmos, DigiByte and Bitcoin SV price: will the alts fight back? – AMBCrypto

With increasing uncertainty surrounding the markets across the globe, cryptocurrencies have seen higher levels of volatility and steeper price drops. Altcoins have not been able to recover their losses after the March 12 crash and most coins continue to struggle. Bitcoin SV [BSV], Cosmos [ATOM] and DigiByte [DGB] have all endured a dip in their price in the past few days.

Bitcoin SV [BSV]

While the early part of 2020 looked promising for the fork coin BSV, its recent price performance is rather somber. Over the past few days, BSV registered a 12.4 percent drop in its price, and at press time BSV has a trading value of $155. Bitcoin SV currently has a market cap of $2.8 billion and a 24-hour trading volume of $1.6 billion.

As per the 4-hour chart, there is strong support for BSV at $155 and two points of resistance at $167 and $182. Bollinger Bands are slowly expanding at the moment and imply an increase in volatility. As per the RSI indicator, BSV's price has been in the oversold zone and is now moving towards the overbought zone.

Cosmos [ATOM]

Earlier in the year, Binance.US began offering staking rewards for Cosmos; however, not much has changed regarding the fate of this altcoin. Over the course of the last few days, the price of Cosmos has once again registered a dip of 9 percent, bringing the price down to $1.91. If the price were to give in to the bearish momentum and fall further, Cosmos can rely on the strong support at $1.70. However, there are also resistances at $2.03 and $2.25.

As per the MACD indicator, the coin has endured a bearish crossover, with the signal line hovering above the MACD line. The Stochastic indicator is currently in the oversold zone but is heading northbound at press time.

DigiByte [DGB]

DGB's price is at $0.0041 and the coin has a market cap of $52 million. In the past day, DGB has endured a price dip of 13.5 percent, and if the price were to fall further, DGB might find support at $0.0029. On the contrary, if the bulls were to raise the price of the coin, there are two crucial resistances that would have to be breached, at $0.0042 and $0.0057.

Currently, the RSI indicator is heading towards the oversold zone after having spent a considerable amount of time at the top. The MACD indicator echoes a similar sentiment, as it has now undergone a bearish crossover.


Ripple, Binance and Gemini Get Exemption In Singapore – Cryptocurrency Regulation – Altcoin Buzz

Ripple Labs, Coinbase, Binance, Gemini, and other cryptocurrency firms have been permitted by the Monetary Authority of Singapore (MAS) to operate in the country without obtaining a license. The exemption is applicable for a limited period of time.

Singapore's financial regulator, MAS, has granted a license exemption to some cryptocurrency firms that allows them to offer specific digital payment services until July 28, 2020.

At the end of this period, the firms must apply for the relevant license in order to continue carrying out their offers and services.

Apart from Coinbase, Ripple, and Binance, other recognized cryptocurrency firms have been granted exemptions. They include the Singapore entities of AAX Exchange, Cumberland, DRW, LedgerX, GSR, OKCoin, Pundi X, and a few others.

The Chief Legal Officer of Pundi X Labs, David Ben Kay, said: "In compliance with the PS Act, we will be filing our license application to operate account issuance and digital payment token services by 28 July 2020."

Some other crypto firms were offered a longer period of exemption. A 12-month exemption was granted to BitGo, a subsidiary of one of the biggest Bitcoin payment processors, as well as to Gemini Trust Company, led by the Winklevoss twins.

According to MAS, these firms can offer domestic money transfer, account issuance, and inward cross-border money transfer services in Singapore until January 28, 2021.

The regulator said: "Please note that these entities are not licensed under the PS [Payment Services] Act to provide the specific payment services, but are allowed to continue to provide the specific payment services."

Some local crypto companies in Singapore have also praised PSA for serving as a legal instrument that encourages blockchain-related businesses.

The Payment Services Act, the law regulating payments in Singapore, was established at the beginning of 2020. The Act came into force after MAS voiced its concerns about cryptocurrencies, whose anonymity can facilitate money laundering. As a result, all cryptocurrency firms must possess the relevant license in order to operate in the country.

Besides, under the Act, there are three classes of licenses that can be granted:

Each service provider needs to hold only one of the three licenses, said MAS.


Lost in translation and adrift in cloud storage – The Register

Who, Me? Welcome to a cautionary Who, Me?, a warning to all those lured by the promises of the cloud storage giants and a language lesson for all.

Our story concerns "Dirk", who at the time of our tale was hard at work in a Netherlands IT department. Dirk himself didn't actually speak much Dutch; his first language was English but that was more than enough to get by with in the land of clogs, windmills and dikes.

Having enjoyed a few relatively peaceful months in the job, Dirk told us he was "getting on with tidying up the systems and dealing with the technical debt that had accumulated since the last Big Cleanup a number of years previously."

"One such system," he said, was "cloud-based storage." The vendor was a well-known giant of the industry, but for the purposes of this story, Dirk called it "Poodle".

The task that day was dealing with obsolete accounts, not just from a security standpoint, but also due to licensing costs. And, he said ruefully, "It seemed like a good idea at the time."

Dirk ploughed through obsolete accounts until he came to one with an odd name: "beheerder". Perhaps an amusing play on "beheader" by some long gone techie? Or something to do with "herding" files? Dirk checked in with his predecessor, who gave the digital equivalent of a shrug. The account had been around since things had been set up back in the day, but nobody had used it in ages.

Indeed, it had been well over a year since anyone had actually logged in using the account. The password was reset and the mailbox scrutinised.

"Perhaps some more notice should have been made of the fact that at one point it was receiving system error messages," sighed Dirk, "but what use is hindsight?"

Anyhow, the messages seemed to have stopped recently and the rest was just junk. To keep things spick and span the account was deleted and, as was standard practice, all files transferred to Dirk.

He thought no more about it until a fortnight later, when he decided "to move all those transferred files to the new-fangled Poodle 'Shared Drive'." Heck, everyone else's documents were due to be moved there at some point ("with the correct access permissions set," he added).

Dirk clicked "Ignore" on the standard "Files accessed by other users may be affected" warning. This would prove to be a fateful click.

After 20 minutes, one of his colleagues called to complain that they couldn't access one of their files.

A coincidence, surely. But Dirk began to feel the arse-swooping sensation of dread that something awful might be happening.

Perhaps a little later than he should, Dirk took a closer look at the files being moved. He looked a little closer since some were still in progress. He poked further, down two or three folder levels and saw something that looked familiar.

"Horrifyingly familiar. In fact, it almost looked exactly like the core folder structure that everyone in the office used."

A little more investigation and Dirk realised that he could skip the "almost."

As his stomach sank through the floor, Dirk realised "that over the next few hours file access will disappear for all users."

"It wasn't immediate," he added "because file storage is cloud-based and sometimes it takes its own, undefined, time to do large file moves."

The helpful interface afforded no way to stop the hellish process of borkery. In a panic he called the cloud vendor, observing: "They're rather large and helpful in inversely proportional quantities."

And, of course, they couldn't help. For whatever reason, "there didn't appear to be a way to pause or stop the oncoming slaughter. Everyone was going to lose access to their files."

Having told the increasingly anxious users that there was a "small" problem, Dirk peered at the unfolding carnage he had inadvertently wreaked upon the company's files:

"Some were still in the old location, and some were in the new location."

He tried to move the files again. And again. On the third try he got some response; the system helpfully told him that it was "moving files", but without any kind of countdown or progress bar.

"Empty folders were being created," he told us, "and since the system was still progressing the previous move requests things got very complicated - sometimes files were moving and sometimes not. And sometimes the files were copied before they were moved. On the plus side Poodle Drive would happily create folders before deciding whether or not to copy the files, so that's something isn't it?"

What had happened was that the user "beheerder" had created the root folders back in the day, and other users had created subfolders. A little too late, Dirk discovered that the Move function he had used "only affects folders that you have ownership over. This doesn't include subfolders you didn't create."
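Dirk's missing pre-flight check, had the interface offered one, might look like the sketch below: walk the folder tree and list everything a Move would leave behind. The tree representation and function name are hypothetical illustrations of the ownership rule, not Poodle's actual API:

```python
def foreign_subfolders(folder, user):
    """List subfolders that a Move by `user` would leave behind because
    they are owned by someone else.

    `folder` is a nested dict: {"name": str, "owner": str, "children": [...]}.
    """
    left_behind = []
    for child in folder.get("children", []):
        if child["owner"] != user:
            left_behind.append(child["name"])
        # Recurse either way: a user's own folder can still contain
        # subfolders that other users created.
        left_behind.extend(foreign_subfolders(child, user))
    return left_behind
```

Running this against the "beheerder"-rooted tree before clicking Move would have shown exactly which colleague-created subfolders were going to stay put.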

"Suddenly," he said "there was a smorgasbord of folders spewn around the system. Some stuff stubbornly in the old location, and some stuff in new locations. Hundreds of gigabytes of data in fact. Smeared everywhere."

The good news was that there was a backup, so nothing was actually lost. The only challenge was deciding between "Restore" and "Restore with File Permissions." Fearing what would happen to existing files, he restored without those permissions. At least then he'd be able to see all the files and put them back correctly. He could then tidy up the folders and set the correct access levels.

"It took a month," he said.

"Hindsight says that I should have checked out the other option too, but maybe next time(!)"

Our tale ends happily. The users were eventually happy. The storage was tidy. Dirk, while he obviously wasn't paid overtime for his efforts, survived. He also learned to take things a little slower and think a little harder about what those messages were telling him.

Heck, he's even learning Dutch.

Apparently, "beheerder" means "administrator".

Ever been bitten on the behind by cloud storage, or gaily skipped past a message box without fully understanding what it was telling you? Sadly, it is all too common. Share your tale of woe with the sympathetic vultures at Who, Me?



How to combat insider threats as organisations increasingly rely on cloud computing to telecommunicate – TechRadar

Cloud providers including Microsoft, Google, and others, have recently acknowledged that they are struggling to deal with a spike in remote tools usage.

As organisations hastily adapt for remote working, they might fail to ensure adequate data security. In particular, cloud usage increases the risk of insider threats: 53% of organisations believe detecting insider attacks is significantly harder in the cloud than on-premises, according to a recent report. Therefore, it has never been as important as it is today for organisations to implement proper measures to mitigate the insider threat and protect data in the cloud.

Firstly, remote employees use cloud applications to exchange data, including sensitive data, and could misplace it in insecure locations, which could lead to a compliance violation. For example, sharing sensitive data via Microsoft Teams, an increasingly popular application for telecommunication, will result in data spreading across SharePoint Online storage with a high risk of unauthorised access. In fact, 39% of the UK respondents to our recent survey are sure that employees in their organisations share sensitive data via cloud applications outside of IT's control.

Secondly, remote employees often work from their personal devices which are not controlled by the corporate IT team, and as such are more prone to data breaches than their corporate PCs. Such devices are often unpatched and, therefore, vulnerable to cyber threats. Once an attacker has a foothold in the employee's device, they have "remote control" and can observe and leverage any outgoing connections from this. Essentially, they can gain access to all corporate cloud services the user connects to or even to the corporate network on-premises as soon as the user establishes their VPN connection or remote desktop (RDP) session to any internal servers.

In addition, an employee might lose his/her device, or let other family members use it, which will result in unauthorised access. In some rare cases, employees copy sensitive data to their personal devices from corporate cloud storage with malicious intent, which also is a serious security risk.

In normal circumstances, before asking employees to work from home, an organisation should ideally develop proper security policies with a specific focus on cloud security. First and foremost, it is critical to ensure that all user permissions to storages with sensitive data are granted on a 'need-to-access' basis, to prevent insiders from accessing information they do not need to do their job.

In addition, it is important to establish effective access controls as well as efficient identity verification methods, such as multi-factor authentication, which will also protect organisations' sensitive data in the cloud from unauthorised access.

And last but not least, it is critical that the IT department trains employees on cloud do's and don'ts, starting from the principles of dealing with sensitive data and ending with instructions for patching and securing their personal devices. All such measures should be implemented on an ongoing basis, with the IT team being ready to support employees with any issue when they work from home, whether it's an operational problem or a security issue.

If an organisation does not know where its sensitive data resides in the cloud, it cannot ensure that remote employees are following security policies. This is particularly challenging as modern organisations use multiple clouds.

In fact, McAfee has calculated that an average enterprise uses around 1,427 distinct cloud services, while an average employee actively uses 36 cloud services at work. The more cloud services remote employees use, the more challenging it is for an organisation's IT team to track how they handle data. It means an increased risk of misplacing sensitive data and the bad PR and compliance findings that come with that. To reduce data overexposure, it is critical to have technologies in place to automatically discover sensitive data across multiple cloud storages and classify it according to its sensitivity on a continuous basis.
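As a toy illustration of the rule-based discovery and classification step described above, the sketch below scans text for two sensitive-data patterns. The category names and regexes are illustrative assumptions; production tools combine far richer rule sets with context and validation (e.g. Luhn checks for card numbers):

```python
import re

# Hypothetical example patterns -- real classifiers ship many more rules.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text):
    """Return the set of sensitive-data categories detected in `text`."""
    return {name for name, rx in PATTERNS.items() if rx.search(text)}
```

Applied continuously across each cloud store, even a scanner this simple surfaces where sensitive data is accumulating so access policies can follow it.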

As the cloud is prone to a broad range of threat vectors for data exfiltration by insiders, it is critically important for an organisation to detect such cases in a timely manner. Is it malware trying to break into the corporate network, or an insider aiming to steal a customer database? All these cloud security risks, and many others, are accompanied by anomalies in user activity. Therefore, if an organisation uses cloud computing and cloud storage, it is important to have user behaviour analysis (UBA) technologies in place that can detect deviations from normal user behaviour and alert an IT team about potential cloud threats.

Examples of the most common anomalies that indicate a threat include abnormal logon activities (such as attempts to log on from multiple endpoints, multiple subsequent logons in a short period of time, and an unusually high number of logon failures); or data access patterns differing from the user's past behaviour or that of their peers. It is important to note the shift from office work to remote access will probably cause initial changes in users' access patterns. Businesses can expect a higher than normal number of false positives from Machine Learning-based behaviour anomaly detection solutions in the first couple of weeks after users move away from their central offices.
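A bare-bones version of that idea flags a count that sits too many standard deviations from the user's own baseline. This z-score sketch is a deliberate simplification of UBA (real products model many signals jointly), and the three-standard-deviation threshold is an assumed default:

```python
from statistics import mean, pstdev

def logon_anomaly(history, today, threshold=3.0):
    """Flag today's logon-failure count if it deviates more than
    `threshold` standard deviations from the user's own history."""
    mu = mean(history)
    sigma = pstdev(history)
    if sigma == 0:
        # Perfectly flat baseline: any change at all is anomalous.
        return today != mu
    return abs(today - mu) / sigma > threshold
```

This also illustrates the false-positive caveat in the text: once everyone's baseline shifts to remote access, yesterday's `history` no longer describes normal behaviour and the detector needs time to relearn.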

Such measures will help organisations minimise insider threats in the cloud not only during the "world's largest work-from-home experiment," as Time has dubbed the COVID-19 outbreak, but also when it comes to an end. With the economic recession that is likely to follow, cloud computing will remain a cost-effective way to run a business. A sustainable approach to cloud security will enable organisations to avoid unwanted data breaches and hefty compliance fines in the long run.

Matt Middleton-Leal is EMEA & APAC General Manager at Netwrix


What the breakup of data management and storage means to you – TechTarget

In this era of digital transformation, organizations of all sorts are becoming data-centric information managers. Technologies such as AI, IoT, 5G and edge computing are creating data at unprecedented rates. That data is being used to deliver more insights, services, customized services, products and innovation. And stricter privacy laws and regulations on personally identifiable information with harsh financial penalties for noncompliance are complicating the situation.

For organizations that want to derive value from data that's greater than the cost of storing and using it, effective data management and storage is becoming more important than ever. The abstraction of data management from storage systems to run on its own is one approach to better data management.

Data management has several meanings depending on the vendor. It has been defined as ingesting, storing, organizing and maintaining the data an organization creates. But that definition is outdated. It's adequate as data management for legacy storage systems; however, even that falls short for modern storage systems.

Data management today means considerably more, including:

That's quite a bit for a data management system to do -- and do well. Remember, the most important responsibilities of a data management and storage system are ingesting, storing, organizing and maintaining the data. All those other data management capabilities are resource-intensive and negatively impact the system's primary responsibilities.

Most storage systems don't generally work well with other storage systems -- that's not news to storage admins. Many systems have problems working with cloud storage, too. Few -- and that's being generous -- actively work with tape systems.

Multivendor heterogeneous storage is a bigger problem. Storage vendors rarely work seamlessly with one another, which is why storage system-centric data management tends to focus on a single vendor. This approach bypasses the multivendor problem while locking in users to that specific vendor's data management and storage products.

Another data management storage issue is the complicated data management licensing structure. There's the data management software licensing, of course, but it doesn't stop there. There are typically other licensing fees, such as a capacity license fee on the data the storage system moves to cloud storage or other storage systems. Then, there's the cloud storage capacity license fee and potentially egress fees for data access. In addition, when users and applications access moved data, it often must be rehydrated back to the originating storage system. Data movement takes time, adding substantial latency to each access request. It makes more sense to access the data where it resides.

One storage-centric approach to this problem is to put all an organization's data in a single scale-out, all-encompassing storage system, historically referred to as a "god box." This system would have all the storage performance and cost tiers, data protection, archiving and so on, along with all data management.

Even if that storage system could meet every performance requirement, scale every tier to meet hundreds of petabytes or exabytes of data, and do everything data management needs to do today, there are other intractable problems. The data management software would still be a heavy draw on the storage controllers, negatively affecting performance. More importantly, data still must be moved or migrated from where it currently is to this system. And it fails to solve multiorganizational data sharing problems.

These issues have led to a new approach where data management is abstracted from storage systems. The data management software runs on its own server hardware. It sits out of band, in band or a combination of the two.

Abstracted data management captures the data and metadata in one of three ways. It can mount all storage systems with administrative privileges; Dell EMC ClarityNow, Hammerspace, iRODS (open source), Komprise, Spectra Logic StorCycle, Starfish Storage and StrongBox Data Solutions StrongLink do this. Or it can sit in front of the storage, looking and acting like a high-speed network switch -- think InfiniteIO here. And with the third approach, it uses clients, or agents, like Aparavi does. Most of these systems have some level of AI or machine learning built into the software that optimizes operations. Each methodology and vendor has its own pros and cons that will be covered in a future article.

These abstracted data management systems have an outsized positive impact on IT organizations. They commoditize the storage system, reducing both the amount and the cost of storage for each tier. They do that by right-sizing the data to the proper tier and eliminating vendor lock-in.
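
The right-sizing described above can be sketched as a simple tiering policy. Here is a minimal sketch in Python; the tier names and age thresholds are illustrative assumptions, not any vendor's actual defaults:

```python
from datetime import datetime, timedelta

# Illustrative tiers and age thresholds -- assumptions for this sketch,
# not any vendor's defaults. Ordered from coldest to hottest.
TIERS = [
    ("archive", timedelta(days=365)),
    ("capacity", timedelta(days=30)),
    ("performance", timedelta(days=0)),
]

def pick_tier(last_access: datetime, now: datetime) -> str:
    """Right-size data to the cheapest tier its access pattern allows."""
    age = now - last_access
    for tier, threshold in TIERS:
        if age >= threshold:
            return tier
    return "performance"  # future-dated timestamps stay on the hot tier
```

Real abstracted data managers weigh access frequency, cost per gigabyte and policy rules rather than a single age threshold, but the principle is the same: place each piece of data on the least expensive tier that still meets its access needs.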

Storage system compatibility is no longer an issue. Storage systems are merely the containers in which the data resides based on their locality, performance and cost characteristics. These abstracted data management systems also simplify operations in several ways:

Each of these data management systems scales differently. Some are designed to scale to hundreds of petabytes and exabytes. Others scale from terabytes to dozens of petabytes. It depends on their architecture, and most are, by definition, storage-agnostic.

Vendors have different licensing requirements. Some license by terabyte of capacity managed. Others vary that capacity licensing by hot and cold data. Still others license according to the number of servers and server cores required to run their software at the performance level an organization requires.
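
The three licensing models above can be compared with some quick arithmetic. All rates below are hypothetical, chosen only to show how the models diverge:

```python
# Hypothetical rates -- not drawn from any vendor's actual price list.
def per_tb_cost(total_tb, rate_per_tb=100):
    """Flat capacity licensing: every managed terabyte costs the same."""
    return total_tb * rate_per_tb

def hot_cold_cost(hot_tb, cold_tb, hot_rate=150, cold_rate=40):
    """Tiered capacity licensing: hot data is priced above cold data."""
    return hot_tb * hot_rate + cold_tb * cold_rate

def per_core_cost(cores, rate_per_core=1200):
    """Server licensing: cost scales with the cores running the software."""
    return cores * rate_per_core

# 500 TB under management, 100 TB of it hot, software running on 32 cores:
flat = per_tb_cost(500)           # 50,000
tiered = hot_cold_cost(100, 400)  # 31,000
by_core = per_core_cost(32)       # 38,400
```

With this mostly cold data set, hot/cold pricing comes out cheapest, but the ranking depends entirely on the data profile and the assumed rates, which is why the model a vendor uses matters when comparing quotes.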

What does external abstracted data management mean? It means IT organizations can choose storage based on cost and performance, not data management functions. It means storage vendors will no longer have an incumbent advantage. It means simplified IT operations and lower costs. When it comes to data management and storage, all that bodes well for the future.

View post:
What the breakup of data management and storage means to you - TechTarget

How to combat insider threats as organizations increasingly rely on cloud computing for telecommunications – NewsDio

Cloud providers, including Microsoft, Google, and others, have recently acknowledged that they are struggling to cope with an increase in the use of remote tools.

As organizations hastily adapt to remote work, they may not be ensuring adequate data security. In particular, using the cloud increases the risk of insider threats: 53% of organizations believe that detecting insider attacks is significantly more difficult in the cloud than on-premises, according to a recent report. Implementing appropriate measures to mitigate insider threats and protect data in the cloud has therefore never been as important for organizations as it is today.

First, remote employees use cloud applications to exchange data, including sensitive data, and may misplace it in insecure locations, which could lead to a compliance violation. For example, sharing sensitive data through Microsoft Teams, an increasingly popular application for remote work, places that data in SharePoint Online storage with a high risk of unauthorized access. In fact, 39% of UK respondents to our recent survey are confident that employees in their organizations share sensitive data through cloud applications outside of IT control.

Second, remote employees often work from personal devices that are not controlled by the corporate IT team and are therefore more prone to data breaches than their corporate PCs. Such devices are often unpatched and thus vulnerable to cyber threats. Once an attacker has a foothold on the employee's device, they effectively have remote control and can observe and take advantage of any outbound connections from it. Essentially, they can gain access to every corporate cloud service the user connects to, or even to the local corporate network as soon as the user establishes a VPN connection or remote desktop (RDP) session to any internal server.

Additionally, an employee may lose their device or let other family members use it, resulting in unauthorized access. In some rare cases, employees copy confidential data to their personal devices from corporate cloud storage with malicious intent, which is also a serious security risk.

Under normal circumstances, before asking employees to work from home, an organization should ideally develop appropriate security policies with a specific focus on cloud security. First, it is essential to ensure that all user permissions for storage with confidential data are granted on a "need to access" basis, to prevent insiders from accessing information they do not need to do their jobs.
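
The "need to access" principle can be illustrated with a small audit sketch. The role-to-folder mapping and paths below are hypothetical, invented for this example:

```python
# Hypothetical mapping of roles to the storage paths each actually needs.
ROLE_NEEDS = {
    "finance": {"/cloud/finance"},
    "hr": {"/cloud/hr"},
}

def excess_grants(role: str, granted: set) -> set:
    """Return permissions a user holds but does not need for their job."""
    return granted - ROLE_NEEDS.get(role, set())
```

Anything the function returns is a candidate for revocation under the least-privilege principle.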

In addition, it is important to establish effective access controls, as well as strong identity verification methods such as multi-factor authentication, which will also protect organizations' confidential data in the cloud from unauthorized access.

And last but not least, it is critical that the IT department train employees on the dos and don'ts of the cloud, starting with the principles of handling sensitive data and ending with instructions for patching and protecting their personal devices. All of these measures need to be implemented on an ongoing basis, with the IT team ready to assist employees with any issue that arises while they work from home, be it operational or security-related.

If an organization does not know where its confidential data resides in the cloud, it cannot guarantee that remote employees follow security policies. This is particularly challenging since modern organizations use multiple clouds.

In fact, McAfee has calculated that an average business uses around 1,427 different cloud services, while an average employee actively uses 36 cloud services at work. The more cloud services remote employees use, the harder it is for an organization's IT team to track how they handle data. That means an increased risk of misplacing confidential data, along with the bad publicity and compliance consequences that come with it. To reduce data overexposure, it is essential to have technologies that automatically discover sensitive data across multiple cloud repositories and continually classify it according to its sensitivity.
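
Automated discovery and classification of the kind described above can be approximated with pattern matching. A minimal sketch, assuming deliberately simplified regexes (real classification products combine far stricter detection with context analysis and machine learning):

```python
import re

# Deliberately simplified patterns -- real classifiers are far stricter.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_ni_number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def classify(text: str) -> set:
    """Return the sensitive-data categories detected in a blob of text."""
    return {name for name, pattern in PATTERNS.items() if pattern.search(text)}
```

Running a scanner like this across every cloud repository, and re-running it continually, is what turns "we think the data is safe" into an inventory the IT team can actually enforce policies against.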

Since the cloud is exposed to a wide range of threat vectors for insider data breaches, it is vitally important that an organization detect such incidents in a timely manner. Is it malware trying to break into the corporate network, or someone intent on stealing the customer database? All of these cloud security risks, and many others, are accompanied by anomalies in user activity. Therefore, if an organization uses cloud computing and cloud storage, it is important to have user behavior analytics (UBA) technologies that can detect deviations from normal user behavior and alert the IT team about possible threats in the cloud.

Examples of the most common anomalies that indicate a threat include abnormal login activity (such as attempts to log in from multiple endpoints, multiple consecutive logons in a short period of time, and an unusually high number of failed logon attempts) and data access patterns that differ from the user's past behavior or that of their peers. It is important to note that switching from office work to remote access will likely cause initial changes in user access patterns. Businesses can expect higher-than-normal numbers of false positives from machine learning-based behavioral anomaly detection solutions in the first few weeks after users move out of their headquarters.
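
The baseline-deviation idea behind UBA can be sketched in a few lines. The three-sigma threshold and the sample data here are assumptions for illustration, not what any UBA product ships with:

```python
from statistics import mean, stdev

def is_anomalous(history: list, today: int, z_threshold: float = 3.0) -> bool:
    """Flag today's count (e.g. failed logons) when it deviates from the
    user's own baseline by more than z_threshold standard deviations."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold

# A user who normally fails a logon once or twice a day suddenly fails 40 times:
baseline = [1, 0, 2, 1, 0, 1, 2, 1]
```

This per-user baseline also shows why the false-positive caveat matters: when everyone's access pattern shifts at once, as it did with the sudden move to remote work, yesterday's baseline no longer describes normal behavior.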

Such measures will help organizations minimize insider threats in the cloud not only during the world's "biggest work-from-home experiment", as Time has called the COVID-19 outbreak, but also when it comes to an end. With the economic downturn likely to follow, cloud computing will continue to be a cost-effective way to run a business. A sustainable approach to cloud security will allow organizations to avoid unwanted data breaches and heavy compliance penalties in the long run.

Matt Middleton-Leal is General Manager of EMEA and APAC at Netwrix

Originally posted here:
How to combat insider threats as organizations increasingly rely on cloud computing for telecommunications - NewsDio
