Category Archives: Cloud Storage
Moving my data to Azure and Office 365 using Synology Cloud Sync – iTWire
Guest Review: Syncing data between on-prem locations and public cloud has become a very common practice for many organisations, and sometimes even for home users.
I have seen organisations using solutions from storage providers to sync data to Azure Storage Accounts and other cloud providers. Personally, to protect against hardware failure and to be able to access my files while I'm away from home, I've also been wanting to migrate some of my data from my NAS to Microsoft OneDrive and Azure Storage Accounts.
A few weeks ago, Synology reached out and asked me if I'd be interested in reviewing their free Cloud Sync solution with Azure Storage. Since I've been thinking about replacing two 6-year-old NAS devices at home and wanting to move some files to Azure and OneDrive, I accepted the offer. Although Synology supplied me with the NAS device (DS920+), this is not a sponsored post; I'm only sharing my opinion based on my own experience.
Synology Cloud Sync is extremely easy to configure. Once you've logged in to the web portal of your Synology NAS, Cloud Sync can be found in the Package Center, where you can install it with one click (and follow the wizard).
Once installed, we can start creating sync jobs. Cloud Sync supports many public cloud providers such as Microsoft Azure, OneDrive, OneDrive for Business, AWS S3, GCP Cloud Storage, Google Drive, Dropbox, etc. I'll cover Azure Storage Accounts and Microsoft OneDrive (personal) in this post.
Firstly, I created an Azure Storage Account in the Australia Southeast region since it's the closest region to my home. I connected the storage account to a VNet and added my home broadband's IP address to the firewall rule so the NAS device can reach it (as shown below). This restricts access to the storage account to the VNet it connects to, my home IP, and other Azure services (since I've ticked "Allow trusted Microsoft services to access this storage account").
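For anyone who prefers to script this step rather than click through the Azure portal, the sketch below shows roughly how the same account and IP firewall rule could be created with the Azure SDK for Python (azure-mgmt-storage). It is an illustration only: the subscription, resource group, account name and IP address are placeholders, it omits the VNet rule, and model names can vary slightly between SDK versions.

```python
# Hypothetical sketch: create a locked-down storage account with the Azure SDK for Python.
# Placeholders: subscription ID, resource group, account name and home IP address.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    StorageAccountCreateParameters, Sku, NetworkRuleSet, IPRule,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "home-nas-sync-rg"        # placeholder resource group
ACCOUNT_NAME = "homenassyncstorage"        # placeholder, must be globally unique

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

params = StorageAccountCreateParameters(
    sku=Sku(name="Standard_LRS"),
    kind="StorageV2",
    location="australiasoutheast",
    network_rule_set=NetworkRuleSet(
        default_action="Deny",              # block all traffic by default
        bypass="AzureServices",             # "Allow trusted Microsoft services"
        ip_rules=[IPRule(ip_address_or_range="203.0.113.25")],  # home broadband IP (placeholder)
    ),
)

# begin_create returns a poller; .result() blocks until the account exists.
account = client.storage_accounts.begin_create(RESOURCE_GROUP, ACCOUNT_NAME, params).result()
print("Created storage account:", account.name)
```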
I then created a blob container in the storage account.
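A quick way to confirm the container is reachable from an allowed IP, before pointing Cloud Sync at it, is to create it and upload a test blob with the azure-storage-blob package. This is a minimal sketch; the connection string and container name are placeholders, not values from the review.

```python
# Sketch: create a container and upload a test blob with the azure-storage-blob package.
# The connection string and container name are placeholders.
from azure.storage.blob import BlobServiceClient
from azure.core.exceptions import ResourceExistsError

CONNECTION_STRING = "<storage-account-connection-string>"  # from the portal, under "Access keys"
CONTAINER_NAME = "nas-sync"                                # placeholder container name

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)

# Create the container if it does not already exist.
try:
    service.create_container(CONTAINER_NAME)
except ResourceExistsError:
    pass

# Upload a small test blob, then list the container to verify the round trip.
blob = service.get_blob_client(container=CONTAINER_NAME, blob="sync-test.txt")
blob.upload_blob(b"hello from the NAS side", overwrite=True)

for item in service.get_container_client(CONTAINER_NAME).list_blobs():
    print(item.name)
```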
Then on the Synology NAS web portal, I created a sync job with the following information:
Then I can choose the local path (a share on the NAS) and the remote path (a blob container in Azure Storage), and I can also select the sync direction. In this case, I've chosen bi-directional so changes from both ends are replicated to each other.
I can also specify a schedule, e.g. stop syncing during busy hours to save network bandwidth.
I can still modify the settings after the jobs are created. For example, I can configure the polling interval, set network throttling, add folder exclusions and file filters (based on file extensions), and add/remove/modify sync folders.
Depending on the size of the folder and your Internet link speed, the initial synchronisation can take a while. Once completed, you'll see the status as "Up to date".
At this stage, any changes in the NAS folder or the blob container will be replicated. I've done some tests (as shown below):
With Microsoft OneDrive, I have 2 real use cases:
Setting up sync jobs for OneDrive is super easy. All you need to do is sign in to your Microsoft Account when prompted and give user consent:
You will be prompted to be redirected back to the Synology NAS web portal (a typical OAuth workflow):
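For context, that consent step is the standard Microsoft identity platform authorisation flow. The snippet below is a generic, hypothetical illustration of the same flow using the msal Python package; it is not Synology's actual implementation, and the client ID and scopes are placeholders.

```python
# Illustration only: a standard Microsoft identity platform interactive sign-in,
# similar in spirit to the consent step Cloud Sync triggers. Not Synology's code;
# the client ID and scopes are placeholders.
import msal

CLIENT_ID = "<app-registration-client-id>"                 # placeholder
AUTHORITY = "https://login.microsoftonline.com/consumers"  # personal Microsoft accounts

app = msal.PublicClientApplication(CLIENT_ID, authority=AUTHORITY)

# Opens a browser, asks the user to sign in and consent, then returns tokens.
result = app.acquire_token_interactive(scopes=["Files.ReadWrite"])

if "access_token" in result:
    print("Consent granted; token acquired for OneDrive file access.")
else:
    print("Sign-in failed:", result.get("error_description"))
```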
All the other settings are the same as for Azure Storage: you can choose the sync direction, scheduling settings, polling interval, network bandwidth throttling, etc.
For the OneDrive connection, I performed a similar set of tests to those I ran against the Storage Account, and the behaviour is very similar: locally initiated changes trigger synchronisation and are replicated to OneDrive as soon as they are made. However, changes on the OneDrive side don't seem to be replicated to the local folder as quickly as when syncing with an Azure Storage Account. The default polling interval for OneDrive is 600 seconds (every 10 minutes); I tried decreasing it to 15 seconds, but Cloud Sync doesn't seem to poll OneDrive every 15 seconds as configured. The files did appear on the NAS share after around 10 minutes, though. This is not a big deal; I can live with it.
Overall, I'm pretty happy with the features Cloud Sync offers.
For Microsoft OneDrive, although the polling interval is a little too long in my opinion, it is perfect for what I need to achieve. Moving forward, I can definitely see myself setting up more and more folders to sync and storing a local copy of my OneDrive folders on the NAS, so I don't have to keep cleaning up space on the SSDs in my PCs; once you've accessed a file via the OneDrive client, it gets downloaded and stored on your PC permanently.
With Azure Storage Accounts, since Synology NAS devices are generally used by home users and small to medium businesses, in my opinion Cloud Sync offers a very cost-effective way to migrate/synchronise files to cloud platforms. The configuration is pretty easy and the synchronisation is pretty effective based on my testing. However, for large enterprises, I believe it's missing some features:
The documentation for Synology Cloud Sync can be found here: https://www.synology.com/en-global/knowledgebase/DSM/help/CloudSync/cloudsync.
Lastly, I'd like to thank Synology for offering me this great device to work with. I've already loaded it with 4x 12TB HDDs and I'm currently in the process of migrating my files to and from other NAS devices and various cloud storage services.
More:
Moving my data to Azure and Office 365 using Synology Cloud Sync - iTWire
Box: Cloud and data are important, so it's time to join – Illinoisnewstoday.com
Over the last 18 months, companies across Europe have had to make many changes to the way they work and collaborate, and the demand for online services is higher than ever.
Box, with a comprehensive suite of tools and services that diverges from traditional offerings, has benefited from this surge in demand for cloud computing and cloud storage, and is also working on content management.
It seems, therefore, that the company's new EMEA president, Sébastien Marotte, has chosen the best time to join. TechRadar Pro talked to Marotte about how he has settled into the new role and his plans for Box across the region.
Marotte joined Box after spending the last decade at Google, where he was responsible for growing its enterprise business from scratch.
However, Marotte says he was attracted to Box because of the company's willingness and ability to move from traditional storage to content management. "This is a move to something completely different, but it's important and central to all business," he says.
"I'm from a world that has been pushing the message that cloud and data are important for the last five years, so extending it to content was very comfortable for me," he said.
"We are serious about providing solutions that solve business problems, rather than selling technologies and features. Companies are committed to properly hosting all the content connected to their IT systems, and I'm convinced that requires a single platform with the right level of security, and Box brings it to the table."
Marotte emphasized how Box has serious ambitions with respect to EMEA, and the company recently added many prominent customers, such as BT, to its roster.
Not only does he plan to double investment across the region, he says he will continue to support Box's already strong sales strategy and partner ecosystem.
That is especially true because Box has a stable enterprise product at a time when hybrid work is becoming more popular all over the world.
"This is a great opportunity for us," says Marotte. "The new way of working will be a very hybrid model, so every employee must have the right tools to access the content they need to run their business."
Box expanded into the cloud collaboration market a few years ago and is well equipped to meet these needs. Marotte adds that by keeping all content in a single platform, enterprises can not only access their data, but also access it securely from any device at any time of day.
He said that the growth of SaaS applications, especially for remote work, can present enterprises with serious security challenges to overcome and add unnecessary technical debt, but if a way can be found to break down silos and properly protect data, this is a serious opportunity for Box.
Marotte concludes that enabling integrated workflows can help businesses of all sizes work smarter and harder, and that he is aiming to make Box a key partner for businesses across the EMEA region.
"There is no doubt that we can dramatically improve efficiency and productivity," he added.
Read the rest here:
Box: Cloud and data are important, so it's time to join - Illinoisnewstoday.com
Pure Storage rides the hybrid cloud wave to growth in NZ – Reseller News
All flash storage pioneer Pure Storage is riding the wave of hybrid cloud adoption to success both locally and globally.
Founded in 2009, the company arrived in New Zealand seven years ago and is still led by its first local employee, Stuart Blythe.
While not willing to break out local employee numbers, Blythe said the team now covers sales, presales and the channel. But it is Pure's partners that do the heavy lifting in the market.
"Globally, the company is 100 per cent channel," Blythe said. "We dont sell directly to end users. Everywhere outside of the US is two-tier via distributors."
In New Zealand, that distributor is Westcon.
"We are trying not to saturate market with resellers," Blythe told Reseller News. "It's a value based sell, but the major suspects are infrastructure and data centre focused."
Many are users of Pure's products as well, with MSPs and SaaS vendors being the company's biggest verticals.
Pure enjoyed strong differentiation as one of the first vendors to come to market with all-flash products.
In the beginning it focused on customer usages and workloads that could benefit from flash, because flash costs back then were far higher than they are now, Blythe explained.
Consumption of flash is now mainstream, he said. It is just defined as "performance" storage, the Tier 1 for application workloads.
"As that has shifted, the share and visibility and opportunities we have engaged with are much broader than seven years ago," he said.
That has been helped by what he said are unique capabilities in data reduction to use raw NAND flash very efficiently.
With flash now available at lower price points, the total addressable market has also expanded into Tier 2 as well.
"When we brought all flash to market, we did it in a different way," Blythe said. "The founders realised there was a shift."
Other players were retrofitting flash-based storage into disc-based operating systems, but Pure engineered its platform from scratch with advanced data reduction capabilities.
The end-product was 1.5- to two-times more efficient.
"Because it was built from the ground up on flash, we were able to reimagine the customer experience from an architecture and simplicity perspective," Blythe said.
Simplicity was critical, especially to support and orchestrate the transition from on premises to cloud.
The architecture also had to be non-disruptive to enable upgrades and other changes without down-time.
That architecture also supported a new "evergreen" storage business model that eliminated lift-and-shift upgrades and a price hike every few years.
Blythe said customers can buy an array with three years of support. The gold support option included a non-disruptive upgrade, including new controllers in the existing hardware, at no additional cost over the next three years.
"It's still capex plus support but once you've bought the asset, you never rebuy it," he explained. "You keep buying maintenance while also subscribing to hardware and software innovation."
As of July, more than 2700 customers globally had experienced non-disruptive storage upgrades and the average number of upgrades grew 38 per cent year-over-year for the last five years, Pure said.
There was an interesting dynamic emerging in the market with the ascent of hybrid cloud, Blythe said.
Public cloud doesn't necessarily deliver performance, availability and a commercial outcome simply by virtue of a lift and shift. Customers still want very high availability and performance.
Infrastructure-as-a-service (IaaS), for instance, is mostly focused on on-premises style workloads. It is almost a traditional technology stack rather than cloud native.
"A lot of people say going on a cloud journey but need to understand what that means," Blythe said. "It is absolutely becoming a hybrid world."
Rapid recovery, however, is the topic du jour with ransomware attacks both professionalising and proliferating. The ability to get data back very quickly was an emerging flash storage use case.
"We have customers using flash for back up," Blythe said. "It's counter intuitive for speedy recovery."
In essence, flash has spread from tier 1 all the way down to tier 4.
To address the needs of software-as-a-service (SaaS) vendor customers, Pure built an "immutable" snapshot capability into its products. Even administrators can't get access to it.
Because ransomware infiltration typically happens weeks even months before a ransom demand is made, the ability to keep point-in-time snapshots and to allow rollback and recovery at high speed is being seen as very beneficial.
Winning customers has proved key to winning partners, Blythe said.
"We have a lot of the traditional partners with long standing vendor relationships that they are happy and comfortable with and dont want to disrupt," he said.
"But when you start winning a couple of customers off them, you start being taken notice of. They come based on the success they see we are having."
In that context, customer satisfaction is key and Pure puts great store on net promoter score (NPS). That also puts the company in Tier 1, with NPS sitting at around 83.5.
Local customers include TSB, Toyota Financial Services, Ballance Agri-Nutrients, BCS Group and Kensington Swann.
Last September, Pure bought Portworx, a data storage and management specialist focused on the Kubernetes market.
That delivered another set of partners with capabilities, services and value that were not based around data centre infrastructure.
"They are very much focused around concept of data mobility cloud is not a destination," Blythe said. "Portworx will give that abstraction to make data mobile in the cloud."
The goal there was to give customers an on-premises enterprise experience that was more cloud-like, and to make some cloud-based infrastructure operate more like enterprise storage.
Link:
Pure Storage rides the hybrid cloud wave to growth in NZ - Reseller News
Is Google reading content of files you upload to Google Drive? – TWCN Tech News
After Google announced that its cloud storage service Google Drive will ban the distribution of misleading content, a flurry of comments and speculation has been doing the rounds. Some strongly believe that, under the garb of banning the distribution of misleading content, Google is reading the content users upload to Drive.
Firstly, we should note that the decision is not about private files but about distributed content. Google doesn't necessarily spy on its users' private files, but it scans them when you share them publicly. For example, if someone keeps all the pirated movies he/she wants on his/her Drive and gives access to friends, Google can restrict access after scanning the contents. As such, this scheme doesn't apply to your private files or privately shared documents; Google is acting only on complaints.
"We need to curb abuses that threaten our ability to provide these services, and we ask that everyone abide by the policies to help us achieve our goal. After we are notified of a potential policy violation, we may review content and take action, including restricting access to content, removing the content and limiting or terminating users' access to Google products," reads the Abuse Program Policies and Enforcement page for Google Docs.
Secondly, it is worth pointing out that cyber-criminals make sincere efforts in keeping their communications secret instead of hosting them in plain text on the servers of tech giants like Google.
We agree, there's no such thing as absolute free speech with no limits. However, a reputed search giant like Google can't go that far (reading your private files) in preventing misinformation. The maximum they can do is disallow things that could have a direct bearing on the democratic setup.
Whats your take on the story? Share your thoughts with us in the comments section below.
Source: Hacker News.
Read this article:
Is Google reading content of files you upload to Google Drive? - TWCN Tech News
Bottom Line: When will ransomware attacks hit the Upper Valley? They already have – Valley News
No longer is it just a matter of time until an Upper Valley institution, business or town gets hit with a ransomware attack. It's already happened. Cybersecurity experts say it will keep happening, and anyone who depends on a computer network to run their business, school or town (in other words, everyone) should be prepared.
"Yes, they've happened. Can I talk about them? No. But they happen," said Ray Coffin, founder of All-Access Infotech, a Fairlee information technology consultant who builds and manages IT systems for small and medium businesses in the Upper Valley. "It's at the forefront of every conversation we're having."
Unless you've been living off the grid (and some do in the Upper Valley) and are blissfully unaware, barely a day passes without a business, if not an entire industry, being held hostage by a ransomware attack. It's a thriving extortion racket: one study estimates that a total of $406 million in ransom money was paid out to perps in 2020, up 337% from 2019.
The M.O. is familiar: a shadowy group (many are said to operate from inside countries hostile to the U.S., such as Russia, Iran and North Korea) seizes control of a target's computer networks and demands money be paid before supplying the key that unlocks the seized network.
Prominent recent ransomware examples include the attack on the Colonial Pipeline, which carries gas to the East Coast and was shut down until the operator paid $4.4 million. Another attack on JBS, which processes 20% of the country's meat supply, led to a payment of $11 million to bring its plants back online.
When I thought about which businesses in the Upper Valley might be smart about mitigating against the risk of a ransomware attack, Hypertherm was the first to come to mind.
The Hanover-based, employee-owned company is a world-class manufacturer of plasma and waterjet cutting technology.
Hypertherm sells a hefty percentage of its products in the international market and relies upon a global supply chain for materials, thereby raising its risk profile because bad actors could have numerous entry points into its networks.
And, I learned, Hypertherm was an early ransomware victim.
"Back in 2010, we were hit three times in less than a year, and it took down production for half a day," said Robert Kay, IT chief at Hypertherm. "We did not pay any ransom and were able to use our backups to restore operations, but it became clear this was a problem we had to address."
The ransomware attack, Kay said, kicked off an action plan that reviewed everything from the company's IT infrastructure to employee interactions with company systems that elevate risk. Kay declined to name specific measures, but one of the actions the company has taken is to bring on a security expert with advanced training who is qualified to join FBI briefings on cybersecurity threats.
The in-house cyber specialist is also a certified ethical hacker, a qualification that involves training in the latest hacking techniques and skills in order to penetrate the company's computer systems, discover vulnerabilities and fix them.
"We get attacked often," Kay said, but so far, thanks to the seriousness with which Hypertherm has responded to the threat, "we haven't been impacted."
The company also carries ransomware insurance, he said.
In a scenario perhaps most relevant for the Upper Valley, the computer system of Leonardtown, a small town in rural Maryland, was shut down after it was exposed to a ransomware attack through the vendor that operated the town's IT system, which in turn relied on software from a targeted company.
Although the town itself was not directly attacked, the incident destroyed the data files the town used to meet its payroll and send out quarterly utility bills to its 3,000 residents.
Lebanon City Manager Shaun Mulholland said that kind of situation is one of the reasons he prioritized switching IT firms and beefing up the citys internal IT department shortly after he took over in Lebanon in 2018.
After an assessment of the city's IT infrastructure found significant weaknesses, they had to "totally revamp the whole system," said Mulholland, a former police chief in Allenstown, N.H.
The city spent $750,000 to upgrade IT security, including a new computer system that operates the city's water and sewer plants.
"There were a lot of things people could hack into," he said.
And although Mulholland said Lebanon has not been the target of a ransomware attack, the city is regularly inundated with so-called phishing attacks that attempt to trick city employees into revealing their passwords in order to hack into email and other accounts.
Now that Lebanon's cybersecurity has been improved (nobody is 100% secure, Mulholland acknowledged), the next step will be to have a cybersecurity firm conduct tests with city employees to check how on guard city workers are about protecting passwords and information that could allow a bad actor to hack into the city's computer networks, Mulholland said.
Mulholland explained the testing will be to ensure city employees are following protection protocols and to coach them if they make mistakes, not to discipline anyone over errors.
"Nobody's going to get into trouble," he said.
Most small, mom-and-pop businesses do not have Lebanons budget to plug holes in their computer systems, but there are still things they can do to minimize the risk of a ransomware attack, according to IT consultant Coffin.
"Make sure all your data is backed up on a cloud provider and cloud storage," Coffin said, explaining that if a business finds it is locked out of its data files, it can easily pivot to the backup files and will not be compelled to pay the attacker for the key to get the data back. The only data the business would lose is the data created since the last backup.
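To make that advice concrete, here is a minimal, generic sketch of an off-site backup to object storage in Python with boto3. It is an illustration under assumed placeholders (the bucket name and local directory are invented), not something Coffin or any vendor in this article prescribes.

```python
# Generic sketch of an off-site backup to S3-compatible object storage using boto3.
# The bucket name and local directory are placeholders; run it nightly (cron/Task
# Scheduler) so that, at worst, only the data since the last backup is lost.
import os
from datetime import date

import boto3

BUCKET = "example-business-backups"   # placeholder bucket name
LOCAL_DIR = "/srv/business-data"      # placeholder directory to protect
PREFIX = f"daily/{date.today().isoformat()}"

s3 = boto3.client("s3")

for root, _dirs, files in os.walk(LOCAL_DIR):
    for name in files:
        local_path = os.path.join(root, name)
        # Preserve the directory layout under a dated prefix in the bucket.
        key = f"{PREFIX}/{os.path.relpath(local_path, LOCAL_DIR)}"
        s3.upload_file(local_path, BUCKET, key)
        print("backed up", local_path, "->", key)
```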
Of course, a business has to pay a cloud storage provider like Amazon or Microsoft, and at anywhere from less than a hundred dollars per month to $1,000 per month depending on the amount of data to be stored, that can be a large expense for a small company, such as a farm stand or handcrafts maker with an online sales platform.
But skimping to pay for protection may only lead to bearing a steeper cost later.
"It should be looked at like rent, one of those expenses in the budget line," Coffin said.
Contact John Lippman at jlippman@vnews.com.
See the original post here:
Bottom Line: When will ransomware attacks hit the Upper Valley? They already have - Valley News
Healthy Komprise doubles revenues and partners AWS in health sector – Blocks and Files
In corporate wellness news, Komprise has doubled its revenue in the first six months of 2021 and is partnering with AWS to sell cloud-tiering data services into the health sector.
The company sells data lifecycle management technology, which can identify ageing, less-accessed files and move them to lower-cost storage tiers, including Amazon's S3 and S3 Glacier cloud vaults. Komprise has an Elastic Data Migration offering, which provides file data migrations to Amazon's Elastic File System (EFS) and FSx for Windows File Server, as well as Azure Files. Users access files from their original locations, and can access their data in AWS, with the option to access it directly rather than rehydrating files back to the primary storage.
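For readers unfamiliar with tiering, the cloud-side building block looks something like the S3 lifecycle rule sketched below. This is a generic, hedged example of native S3 tiering with placeholder names; it is not how Komprise itself moves data, since Komprise tiers at the file level and keeps files accessible from their original locations.

```python
# Generic illustration (not Komprise's product): an S3 lifecycle rule that moves
# objects under a prefix to Glacier after 90 days and Deep Archive after a year.
# The bucket name and prefix are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-cold-data-bucket",              # placeholder
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-ageing-files",
                "Status": "Enabled",
                "Filter": {"Prefix": "archive/"},   # only tier objects under this prefix
                "Transitions": [
                    {"Days": 90, "StorageClass": "GLACIER"},
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```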
Komprise says first half 2021 revenues rose 97 per cent year on year and it had 190 per cent new customer growth and 200 per cent average deal size growth. That is healthy.
CEO Kumar Goswami said in a statement: "Customers are adopting Komprise because we not only find and move the right data to the cloud, but we tier data without users and applications noticing any change and without locking data in the cloud in a proprietary format."
The company announced it has been awarded a patent that extends the capabilities of its Transparent Move Technology (patented in 2019) to enable asynchronous restoration of files from delayed recall storage such as tape. This patent was a joint application with tape system and secondary storage vendor Spectra Logic.
We think a Komprise and SpectraLogic partnership marketing initiative might hit the streets later this year.
The health sector partnership with AWS builds upon a deal with pharmaceutical giant Pfizer. Komprise says it helped Pfizer stop 20 years of increasing storage costs and leverage its data tiered to AWS for research, without changing how users and applications access their files. A July 22 AWS webinar will discuss Pfizer's use of Komprise and AWS cold storage technology.
Komprise was started up in 2014 and has taken in a relatively small $42 million in funding, with the last round taking place in 2019 and raising $24 million. This is small potatoes compared to data protection and management startups like Cohesity ($660M) and Rubrik ($552M+) but on a par with other file lifecycle management startups like StrongBox ($27M).
The three founders are Goswami, President and COO Krishna Subramanian, and CTO Michael Peercy. The threesome set up Kaviza to replace SAN storage in VDI and sold it to Citrix in 2011. Previously they founded Kovair, a software tools company which is alive and prospering.
Komprise has partnerships with HPE and Pure Storage, and works with AWS, Azure and NetApp Cloud Volumes. It clearly has tech that works with other suppliers' kit.
We think Komprise could possibly IPO, but it's more likely that it will be acquired for its file scanning, indexing, transparent move and analytics technology. It would be a good fit for any larger IT supplier looking to move into hybrid cloud data management. Dell is aiming to move into the data management market. Just sayin'.
See the article here:
Healthy Komprise doubles revenues and partners AWS in health sector - Blocks and Files
Shailesh Haribhakti discusses audit renaissance and the deployment of cyber and digital security measures – Free Press Journal
Boards across the world now recognise that nothing short of an audit renaissance will make them feel satisfied about their oversight on cybersecurity challenges. The feared trillion-dollar number has entered the fear factor gauge as infrastructure breakdowns, halting of operations, ransomware demands and egregious data leakages have grabbed headlines all over the world. Some of the most sensitive organisations in the world have fallen prey, despite massive investment in cybersecurity!
The basic three-part renaissance required can be summarised as follows:
1. Raise global awareness about the subject: Use examples, videos, drawdowns from repositories, sessions by experts and a cutting-edge self-study module available for widespread free usage.
2. Build a culture of safety: Nothing short of global cooperation will work. All incidents, patches, clever attempts to steal, closed down operating assets and restarting strategies must be uploaded to a global repository. Access to the repository must be authorised, universal and uninterrupted. Custodians for this repository should be Central banks of the largest 10 nations on earth, by rotation. All tools, protocols and frameworks that create safety must also be universally shared.
3. Build human and mechanical competence to detect early and counter threats: No lags in continuous monitoring and auditing should be tolerated by the system. Any post facto checks can only be useful as future learnings about attempted attacks. Any breach is too costly to afford and therefore must immediately be uploaded to the repository. As the repository is a true universal asset, it will acquire the status of being protected, curated and shared universally.
Only an establishment with infrastructure of this quality will support unstoppable enhancement in computing power as quantum computing comes online. Storage and retrieval systems will also have to be constantly kept in a state of accelerated improvement. The battle between the forces of good and evil will have to be transported to cyberspace. Knowledge and vigilance must trump greed and fear!
I invited three organisations whose boards I chair to share their policies and practices. I am sharing these practices here, which have evolved over years of effort, to serve as examples of how all can learn and improve by sharing:
Lessons from Blue Star Limited
Cybersecurity risk management is a process of swift detection of emerging risks, assessing their potential impact, and determining how to respond in an agile manner if those risks materialise. A cybersecurity management strategy is kept refreshed at all times, as experience builds.
Effective cybersecurity risk management happens on a continuous basis, both at cultural and operational levels.
Blue Star has enhanced its cyber risk management framework through the following initiatives:
Establishing Culture
While developing a cybersecurity risk management programme, the first thing to initiate is embedding it in the company's culture. The average cost of a cyberattack is approximately $1 million, and 37 per cent of organisations attacked have had their reputation tarnished as a result of the attack. This is why a cybersecurity-focused culture must be established at all levels in the organisation, to prevent loss.
An important aspect is guarding against vulnerable human behaviour. This is done by adequate training and awareness to recognise phishing emails and other social engineering attacks.
Security Operations Centre (SOC)
Blue Star implemented Security Operations Centre services that house an information security team responsible for monitoring and analysing the security posture on an ongoing basis. The SOC team works closely with the organisation's incident response team to ensure that security issues are addressed quickly upon discovery.
Benefits of SOC to Blue Star:
1. Monitoring of security-related incidents round the clock and correlating them with global emerging threats.
2. Proactively hunting for targeted attacks, advance threats, and campaigns.
3. Developed the ability to ward off a ransomware attack
4. Reduction in the incident investigation and remediation time.
Vulnerability Assessment and Penetration Testing (VAPT)
Periodic comprehensive VAPT testing is a strictly disciplined activity. This includes Application Security review, Wi-Fi Penetration testing, Infrastructure Penetration Test, Endpoint Security Review and Secure Configuration Review for Servers & Networks.
Secured Websites
Deployed SSL certificates for web portals; security standard compliance extended to software partners.
Information Security Policy
A set of policies and procedures has been formulated to ensure users understand and comply with a set of guidelines on handling of information stored within Blue Stars network and systems.
Information Rights management tool
Data residing in unsecure locations is accessible to individuals who must not have access to it. This is a common use case within any organisation, where unintended user groups gain access to data. Such a situation may cause data leakage to parties which do not have the organisations best interests in mind.
Blue Star has deployed Seclore software to protect sensitive information flows. This helps to protect sensitive data that is shared between internal users and user groups. Pre-defined permission policies for documents stored in file repositories and file server folders are in place. When a document is added to the repository or the folder, permissions for print, copy and forward are attached to the document. Only certain groups of users are allowed access to sensitive documents.
Protection during Internet Access
Data on employees' laptops is protected at all times, even when employees are outside the Blue Star network, i.e. when they are accessing the Internet over less secure and vulnerable public Wi-Fi connections or from home. An intelligent guard is installed to protect against malicious websites, viruses, worms and Trojans. This is especially important when almost all of the organisation is working remotely.
Also, there might be incidents when some of us inadvertently access links that may be malicious. This is where the Zscaler Cloud Proxy tool kicks in to guard employees' machines while they access the Internet. The tool also offers a dashboard that provides important MIS on overall security and usage.
Backup and restoration
Blue Star has enhanced its data protection by introducing an enterprise class back-up and restoration tool to retrieve data during any cyber or other disruptions.
Insurance Policy
A Cyber Insurance Policy has been obtained to protect the company from loss incurred from corruption of its data by unauthorised software, computer code or third-party data, wrongful appropriation of network access codes, disclosure of third-party data by the company's employees, etc.
Cybersecurity insight from L&T Financial Holdings Ltd
The potential data loss from a hack per company could run into millions per year. One failure to defend against a hack can spell disaster. Most of the attempts get repulsed at the external firewall-level itself.
Key aspects of defence (It is more or less like Army defence of land):
1. Be aware of possible avenues of breach. Examples are third party APIs, vendor access to systems etc. These are more vulnerable.
2. Invest proactively to strengthen the posture of defense.
3. Create awareness among all employees of cybersecurity's importance and reduce the chances of accidentally or intentionally leaking information outside. Access is controlled, and development code is held in a code repository instead of on individual machines.
4. Have multi-layered architecture to ensure that the attacker, if successful, does not get deep within.
5. Everyone has a role to play in defence; it is not only the cybersecurity team's job. While that team leads the effort, others have to complement it.
6. Regular sharing of practices among companies. This builds overall environment against attackers and they get less encouragement.
System malfunction is curtailed. Security checks that increase the per-transaction time taken are weeded out continuously as new techniques become available.
Access controls might sometimes deny usage to genuine users. Potential mitigants that we apply are as under:
1. Sanity testing of production systems before making them live.
2. Performance testing, post implementation of information security controls, with simulated traffic in a pre-production environment.
A critical aspect is: how exactly does information security get staffed? For most evolved functions, a separate layer that conducts audits is deployed, i.e. internal audit and statutory auditors. Information security must avoid an inherent conflict of interest, so providing security and auditing it are kept separate.
Information security is a new function, but the internal audit function is slowly being beefed up through reskilling. Statutory auditors also have to pick up the slack as they get into ESG and technology-driven continuous audits.
Insights from NSDL e-Governance Infrastructure Limited
There are six pillars around which IT security has been thought through. They are:
IT Infrastructure security
Application security
Endpoint security
Third-party risk assessment
Business resilience and
Security governance.
1. IT Infrastructure security - covers aspects like server patching, network security, firewalls, access etc. for both cloud and on-premises infrastructure. This is a monthly activity to update all patches and secure all bases.
2. Application security - covers all APIs, mobile applications and all existing workflow applications. All changes have to be first cleared through information security and the testing of production environment is also done.
3. Endpoint security - since we are a BYOD company, this basically operates under a zero-trust policy. Tools are deployed to enforce a checkpoint between the device and our network layer. Also, monitoring of end devices is in place.
4. Third-party risk - we have a large ecosystem of third parties comprising fintechs, bureaus, call centres, vendors and other technology partners. We try to have controls over them either through direct control using audits, or we give them pointers for self-certification. Self-certification is used in the case of large companies only.
5. Business resilience - basically, around ensuring applicability of DR or ensuring that applications are in high-availability mode to ensure business continuity in case something goes wrong.
6. Security governance - last but not least, regular review of our status. Monthly security posture reviews are done by the CDO and CRO. In addition, this also gets reviewed at the Board committees for RMC and IT strategy.
Some of the important cyber and digital security measures deployed are:
1) Global Standards and frameworks that are most widely and successfully used. A yearly update is mandatory.
2) Multilevel, defence-in-depth security architecture deployment. Data traffic is subjected to at least 4-5 levels of scrutiny / checks (using different methods) before it reaches the main system.
3) Daily automated scanning of application systems and infrastructure is done to detect any new known vulnerabilities early. Findings are reviewed / verified and an action plan defined to fix these vulnerabilities. Counter-measures such as a Web Application System (machine learning based) are deployed to prevent the exploitation of vulnerabilities that need time to fix (due to upgradation of version or application dependency).
4) Security posture (attack surface assessment) and benchmarking against the peers in the industry is carried out using automated platform-based services. A real-time dashboard helps regular monitoring and planning of action to maintain / enhance the posture.
5) Zero-trust approach: role-based access is followed. Internal users also don't get to access the system directly. Firewall rules determine who will be allowed access. Privileged users don't have access to credentials; an intermediate system logs in using securely stored credentials, and each action is logged / anonymised.
6) Industry-standard key strengths and algorithms are adopted. This applies to all three phases: data in motion, data at rest and data in use.
7) Unstructured data is monitored based on the policy defined by the respective data owners. Data leak prevention systems block the data, disallowing its transfer through any channel (removable storage, web based storage, print or email).
8) Emails contain critical information, as these are the most preferred channels of communication. Therefore, email on mobile is provided only through a separate secured container within users' mobile devices. This provides features such as disallowing the copying of data or attachments outside the container, taking screenshots, etc. If an email is forwarded, DLP rules apply.
9) Data traffic from all the above technologies / devices is monitored 24x7 with the help of state-of-the-art tools, fine-tuned processes and skilled resources. Correlating events, detecting anomalies and triggering a ticket to the resolver group is an automated process.
10) Well-thought-out cybersecurity / information security policy and process are deployed to ensure uniformity of action to meet the organisation security objectives. Continuous review and finetuning is undertaken to ensure robustness. Review is done up to the board level for critical cybersecurity policy.
11) Continuous security awareness training is provided to all the employees of all levels. Awareness sessions are conducted for top management and board members.
12) All these controls are audited on a continuous basis by internal auditors / independent experts as well as the certification auditors, and reported to the audit committee of the board.
Cybersecurity is receiving adequate attention at the highest levels and awareness is getting widespread. The battle is on. Winners will be the diligent and vigilant.
The writer is a corporate leader based in Mumbai. He is a chartered and cost accountant and writes regularly on the Indian economy and public policy
Read the original:
Shailesh Haribhakti discusses audit renaissance and the deployment of cyber and digital security measures - Free Press Journal
Cloud storage 101: How to back up only the files you need to – Komando
When it comes to important files, many of us have everything backed up directly to the cloud. But when you back up everything, you can fly through your allotted storage very quickly.
If storage isn't an issue, you still may want to exclude certain files from the cloud for safekeeping. Most services give you a few options to be more selective about what you save and what you don't.
Our sponsor, IDrive, for example, allows you to exclude certain folders completely from being synced. If you don't want to exclude an entire folder, you can instead use an option called selective sync to choose what goes and what stays. We'll show you how to use this helpful feature.
IDrive's cloud-based data backup solution is regarded as the best in the business. It's online, offsite and accessible 24/7 from any device, even your cell phone! If the worst does happen, you can be back up and running in no time with IDrive.
GET SMART: Here are 4 reasons you need to be using a cloud backup for all your devices
This is the part you're going to love most: IDrive is super affordable. Plans start at just a few bucks a month, and if you use the promo code Kim during sign-up, you can get a whopping 90% off your first year. Tap or click here to get started.
IDrive's plans come with 5TB of cloud storage so you can back up unlimited PCs, Macs, iPhones, iPads and Androids to a single account. That's a ton of space. Just ONE terabyte is equal to 86,899,345 pages of Word documents, 500 hours' worth of movies, 17,000 hours of music or 310,000 photos. Unless you have a crazy amount of data, 5 terabytes will work just fine.
Try out IDrive now while you're thinking about it. Save 90% when you sign up at IDrive.com and use the promo code Kim at checkout. That's less than $7 for your first year! You really can't beat that price.
If you are using IDrive as your cloud solution, you can choose to exclude certain photos from being backed up on its servers.
There are multiple ways and options to do this using the main window:
Choose a specific file or folder
Using full path names
If you don't want to exclude an entire folder, you can use a different process known as selective exclusion. This feature allows you to choose the folders you want to sync instead of excluding those you don't want.
This feature is beneficial for those who want to keep most of their information available locally and only back up crucial files.
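To make the idea concrete, here is a generic sketch of how folder exclusion works in any backup script. This is an illustration only, not IDrive's actual software or settings; the source directory and excluded folder names are placeholders.

```python
# Generic illustration of folder exclusion in a backup script; not IDrive's software.
# The source directory and excluded folder names are placeholders.
import os

SOURCE = "/home/user/Documents"                    # placeholder backup source
EXCLUDED = {"node_modules", "Temp", "Downloads"}   # placeholder folders to skip

def files_to_back_up(source):
    """Yield every file under `source`, skipping the excluded folders entirely."""
    for root, dirs, files in os.walk(source):
        # Pruning `dirs` in place stops os.walk from descending into excluded folders.
        dirs[:] = [d for d in dirs if d not in EXCLUDED]
        for name in files:
            yield os.path.join(root, name)

for path in files_to_back_up(SOURCE):
    print("would back up:", path)
```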
These features allow you to be selective about what's being saved to your backup so you can stretch your storage as far as possible. Looking for other storage solutions? Here are three places you can store your photos.
Read this article:
Cloud storage 101: How to back up only the files you need to - Komando
Indians want basic cloud storage that fits their pockets: Digiboxx CEO – The Indian Express
For Arnab Mitra, the CEO of homegrown cloud storage solution platform Digiboxx, there hasn't been a better time to move to cloud storage. "With the world adapting to the work-from-home culture, the usage of smartphones has increased, leading to an increase in the demand for cloud storage," he tells indianexpress.com about what he calls an "extremely future-focused" industry.
Launched in December last year, Indian cloud-storage service Digiboxx offers both free and paid cloud storage plans for individuals and businesses and has already crossed a million users. The service competes with Google One, Microsofts OneDrive and Dropbox by offering relatively affordable plans.
Mitra explains that their success can be attributed to understanding what most Indian customers need when it comes to cloud storage. "Most of the people in India do not communicate in English and fail to understand a very complex platform. The majority of them just want a basic service that will allow them to sync their phones, upload photos, videos, chats, and backups," he states, adding that, most essentially, people want something that just "fits into their pocket".
Digiboxx went live around the time Google announced the end of its unlimited storage space for Photos.
While a Google One storage subscription starts at Rs 130 per month for 100GB, DigiBoxx offers the same 100GB at Rs 30 per month, along with other packages for storage of up to 2TB. Meanwhile, its free plan has 20GB of storage, in comparison to the 15GB that Google offers.
But Digiboxx isn't just targeting individual users. It has plans for Small and Medium Businesses (SMBs) starting at Rs 999, with up to 50TB of storage and a 10GB maximum file size.
Mitra explains that cloud storage is more flexible than traditional physical storage thanks to its ability to create a more tailored solution for individuals as well as businesses. "The package can be customised as per the need of the company, number of employees, etc," he adds.
Cloud storage also makes elements like data recovery more likely for individuals and businesses. Depending on the cause of the problem, it takes longer for traditional storage solutions to recover. "Sometimes one can even lose files completely with a physical storage solution, but on cloud storage that's impossible," he claims.
Apart from lucrative pricing, Digiboxx also boasts of local storage, with all of its data stored within Indian borders. DigiBoxx has connection encryption and all the files stored on its platform are encrypted at a database level. The service offers support for SSL file encryption.
"We are working with multiple Indian data centres to ensure that the data is being stored within the country's borders only," Mitra states, adding that the platform is a first-of-its-kind "Make in India, Store in India" digital asset management SaaS product that is in line with the country's national security and data localisation priorities.
View post:
Indians want basic cloud storage that fits their pockets: Digiboxx CEO - The Indian Express
Banks now rely on a few cloud computing giants. That’s creating some unexpected new risks – ZDNet
Outsourcing key banking data and services to a small number of cloud service providers means that those providers have the power to dictate their own terms.
Banks' growing reliance on cloud computing could pose a risk to financial stability and will require stricter oversight, according to top executives from the UK's central bank.
In a report focusing on financial stability in the UK over the past few months, the Bank of England drew attention to the increasing adoption of public cloud services, and voiced concerns about those services being provided by only a handful of huge companies that dominate the market.
Outsourcing key banking data and services to a small number of cloud service providers (CSPs), said the Bank of England, means that those providers have the power to dictate their own terms, potentially to the expense of the stability of the financial system.
For example, cloud providers might fail to open up the inner workings of their systems to third-party scrutiny, meaning that it is impossible for customers to know if they are ensuring the level of resilience that is necessary to carry out banking operations.
"As regulators and people concerned with financial stability, as (CSPs) become more integral to the system, we have to get more assurance that they are meeting the level of resilience that we need," Andrew Bailey, the Bank of England governor, told reporters in a press conference.
In the past years, financial institutions have accelerated their plans to scale up their reliance on CSPs. From file sharing and collaboration to fraud detection, through business management and communications: banks have used cloud outsourcing both to run software and access additional processing capacity, and to support IT infrastructure.
Until recently, cloud services were used mostly to run applications at the periphery of banking operations, such as HR systems with no direct impact on financial services. According to the Bank of England, however, this is now changing, with CSPs being called in to process operations that are more integral to the core running of banks.
"We've crossed a further threshold in terms of what sort of systems and what volumes of systems and data are being outsourced to the cloud," said Sam Woods, the chief executive officer of the Prudential Regulation Authority (PRA). "As you'd expect, we track that quite closely."
Last year, the Bank of England opened bidding for a cloud build partner, with the goal of creating a fit-for-purpose cloud environment that could better support operations in a digital-first environment. At the time, the institution said that it had already been in talks with Microsoft's Azure, Google Cloud and Amazon's AWS, and that it would likely be targeting Azure in the first instance. The possibility of adopting a multi-cloud strategy was also raised.
There are many benefits to moving financial services to the public cloud. For example, while using old-fashioned, on-premises data centers incurs extra expenses, a recent analysis by the Bank of England estimated that adopting the ready-made services offered by hyperscalers could reduce technology infrastructure costs by up to 50%.
Another advantage of public cloud services is that they are more resilient. The sheer scale of CSPs enables them to implement infrastructure that integrates multiple levels of redundancy, and as such, is less vulnerable to failures.
Moving to the cloud, therefore, is not intrinsically detrimental to banking services; quite the contrary. But the main sticking point, according to the regulators, lies in the concentration of major players that dominate the cloud market. According to tech analysis firm Gartner's latest numbers, the top five cloud providers currently account for 80% of the market, with Amazon holding a 41% share and Azure representing nearly 20% of the market.
"As of course a market becomes more concentrated around one supplier or a small number of suppliers, those suppliers can exercise market power around of course the cost but also the terms," said Bailey.
"That is where we do have a concern and do have to look carefully because that concentrated power on terms can manifest itself in the form of secrecy, opacity, not providing customers with the information they need in order to be able to monitor the risk in the service. And we have seen some of that going on."
As Bailey stressed, part of the reason for CSPs to remain secretive comes down to better protecting customers, by not opening up key information to potential hackers. But the regulator said that a careful balance has to be maintained on transparency, to enable an appropriate understanding of the risks and resilience of the system without compromising cybersecurity.
Leighton James, the CTO of UKCloud, which provides multi-cloud solutions to public sector organizations across the country, explains that these issues are not unprecedented, and it is unsurprising to see them trickle down to the financial services.
"We're anxious about cloud providers becoming so big that the terms and conditions are pretty much 'take it or leave it'. We're definitely seen that happening already in the public sector, and we can definitely see it happening in the financial services sector if we are not careful," James tells ZDNet.
According to James, part of the risk stems from traditional banks attempting to compete against new disruptive players in the sector. Financial institutions are now rushing to overhaul their legacy infrastructure and catch up with the digital-native customer experiences that were born in the cloud and are now widely available thanks to fintech companies.
"It's clearly imperative for the financial sector to modernize and adopt digital technologies," says James. "The question becomes how best they can do that by balancing the risk of digital transformation."
And in this scenario, the risk of placing all of the banks' eggs in a handful of CSPs' baskets is too high, argues James.
The Bank of England has similarly urged financial institutions to exert caution when developing their digital transformation strategies, and is currently in talks with various regulators to discuss how to best tackle those risks.
With cloud concerns widely shared by other nations, especially in the EU, those discussions are likely to become international, and the UK's central bank predicts that global standards will be created to develop a consistent approach to the issue.
Original post:
Banks now rely on a few cloud computing giants. That's creating some unexpected new risks - ZDNet