Category Archives: Cloud Storage
Storage news ticker – May 4 – Blocks and Files
Apricorn has announced the release of the USB 10Gbps Aegis NVX, the first Apricorn encrypted device to feature an NVMe SSD inside. The NVX's high-speed read/write capabilities at 1,000MBps are said to be sought after in the fields of military intelligence, digital forensics, film making, and healthcare, where write speeds over 600MBps are critical. Initial capacity offerings are 500GB, 1TB, and 2TB, with a price range of $352-$754 MSRP.
Couchbase has made its Capella Database-as-a-Service (DBaaS) available on Microsoft Azure Marketplace. It has also launched its Independent Software Vendor (ISV) Starter Factory, which provides ISVs with enhanced support for database management, migration, and cloud infrastructure. The new program is for those looking to develop and monetize applications using Couchbase Capella on AWS. It combines technical workshops with proofs of concept, training certifications, best practices, and alignment with System Integrators (SIs) that are part of the AWS partner network. Acting as a one-stop shop, it empowers organizations to become developer-friendly while offering a low total cost of ownership.
Dell has announced the availability of Dynamic AppsON, which is the combination of VxRail dynamic nodes and PowerStore that delivers simplicity, scalability and ease of management, with advanced storage capabilities and data management features. This pairing combines compute-only VxRail dynamic nodes with PowerStore in a tightly integrated solution that now extends lifecycle management to PowerStore via the VxRail HCI System Software to simplify operations. More info in a blog.
We heard about some layoffs at SaaS data protector Druva. Justin Augat, Druva's SVP for portfolio management, said: "We started a new fiscal year (April 1) and did a minor reduction based on business priorities and individual performance. We are still hiring and growing strong as a business." Mike Houghton was appointed SVP w-w Partners and Alliances on April 23.
The FCIA has a preview webcast on June 21 looking at 128Gbps Fibre Channel: "128GFC: A Preview of the New Fibre Channel Speed." The latest generation of Fibre Channel (128GFC) has a rate of 112.2Gbps (PAM4) for a single-lane variant. This speed is 5.6 percent faster than 100Gbps Ethernet single-lane variants, whose line rate is 106.25Gbps. Fibre Channel was able to increase the speed and still maintain two generations of backward compatibility. Previous-generation SFP optical modules (32GFC and 64GFC) will be able to plug into the latest generation of Fibre Channel 128GFC products.
128GFC products will support existing fiber cable infrastructure for both multi-mode and single-mode variants. 128GFC is also able to support the previous reach of 100 meters over OM4 without sacrificing link quality or increasing errors.
Giga Computing, a subsidiary of Gigabyte and producer of servers, server motherboards, and workstations, has announced continued collaboration with Graid Technology, and the creation of Gigabyte R Series servers to overcome NVMe bottlenecks in storage applications. The Gigabyte R283-S92 server became the first server to support the SupremeRAID SR-1010 RAID card. In addition, all future Gigabyte R283 and R183 servers will incorporate this optimization, making them the servers with the highest compatibility for Graid Technology's next-generation GPU-based RAID card.
HYCU has revealed a customer win: the Boston Red Sox. The Red Sox are using the HYCU Protégé DPaaS platform as part of their IT modernization efforts, starting with more cost-efficient and effective backup to Wasabi as part of their overall cloud migration work. HYCU is also rolling out support for SaaS integrations for Atlassian and Okta as part of its recent R-Cloud introduction. The HYCU brand will be prominently displayed inside Fenway Park throughout the 2023 season.
Infinidat has been named a Customers' Choice in the Gartner Peer Insights "Voice of the Customer: Primary Storage" report for a fourth year, along with Huawei and Pure Storage. There are surprises in the chart, not least the relatively strong showing by Seagate (Exos SAN arrays), the presence of iXsystems and Synology, and the absence of Dell and IBM.
Pure Storage has announced a partnership between Portworx and MongoDB, which includes a first-of-its-kind integration between Portworx Data Services (the industry's first Database-Platform-as-a-Service) and MongoDB Enterprise Advanced. With this integration, Portworx Data Services offers a single pane of glass to deploy and manage MongoDB clusters in conjunction with end customers' technology stacks. Benefits, we're told, include automated MongoDB lifecycle management, lower infrastructure costs, increased operational efficiency, and a single platform interface across hybrid environments.
Quantum's recently released all-flash file and object storage software solution, Myriad, was recognized with three industry awards at the 2023 NAB Show: TV Tech Best of Show, TVBEurope Best of Show, and the NAB Product of the Year in the Cloud Computing and Storage category.
Rockset reckons its product achieved up to 4x higher throughput and 2.5x lower latency than Elasticsearch for streaming data ingestion. It shares more in-depth results and details on the benchmark framework and configuration in a Rockset blog.
Satori has released its Universal Data Permissions Scanner (UDPS), a free open source tool that enables companies to understand which employees have access to what data. It is intended to reduce the risks associated with overprivileged or unauthorized users and to streamline compliance reporting. The answer to the question "who has access to what data?" is often unclear because of complex role hierarchies, the different authorization models used for different technologies, and the variety of data that may be accessible across technologies and clouds. The sales pitch is that UDPS simplifies this complexity. Also, using the UDPS reduces the security risk from overprivileged users by ensuring that access controls are appropriately applied and revoked, we're told.
When the US Department of Commerce imposed semiconductor restrictions on Chinese imports of equipment for processes of 18nm and below last October, SK hynix's Wuxi fab was granted a one-year production license. TrendForce reports SK hynix had planned to transition the Wuxi fab's mainstream process from 1Y nm to 1Z nm, decreasing the output of legacy processes. But because of limitations imposed by the US ban, it is focusing on legacy DDR3 and DDR4 4Gb products. SK hynix's long-term strategy involves shifting its capacity expansion back to South Korea, while the Wuxi fab caters to domestic demand in China and the legacy-process consumer DRAM market.
Synology has launched the SA3410 and SA3610 NAS, 12-bay rackmount devices built for enterprise applications with a focus on scalability and speed for business file serving, VM storage, and petabyte-scale backup applications.
Swissbit's U.3 SSD N4200 offers two to five times faster constant write performance than standard datacenter SSDs while maintaining low latency and high endurance over its lifetime, the company has claimed. Where conventional SSDs lose performance, endurance, and response time over time, Swissbit takes a novel approach with the N4200. The unique firmware, OEM'd from Burlywood, enables the SSD to be optimized for specific application profiles and tuned according to the performance and endurance of a customer application's actual workload, we're told. The 2.5-inch N4200 SSD series is based on enterprise-grade 3D TLC NAND (eTLC), is backward compatible with U.2, and features a four-lane PCIe interface compliant with PCIe 4.0 for up to 8,000MBps bandwidth in both directions. The SSD is available in capacities of 7.68TB and 15.4TB.
We have a chart from Burlywood which helps identify the mainstream SSD suppliers in Swissbit's chart above.
Have ROT Data? Try These Tips To Clean Up and Prevent It – ITPro Today
High-quality data is critical to every decision a company makes: decisions that affect efficiency, productivity, customer satisfaction, and the bottom line. But you can't use good data for your decisions if you don't know where to find it.
Despite an organization's best intentions, data stores tend to overflow, go out of date, and even get forgotten about, given enough time. Data can become such a mess that there is even an industry acronym for it: ROT (Redundant, Obsolete, and Trivial). Data becomes redundant when it has duplicates stored across different systems or locations; obsolete when an organization neither needs nor is required to retain the data anymore; and trivial when the data is simply extraneous and without value.
Clean, high-quality data serves as the bedrock for successful business operations and management. Obtaining high-quality data is a big undertaking and a critical first step, but only the first step. Too often, businesses fail to maintain good data practices. Poor data practices, such as failing to regularly organize and cull data, will result in a mess of ROT data over time.
It's gotten more complicated as businesses adopt more cloud-based resources, creating repositories of data in an exploding array of on-premises and cloud locations. "The ecosystem of where data is being created, how it's being stored, and how it's being managed has exploded by 1,000%," said Juan Tello, U.S. chief data officer at Deloitte Consulting. "[Data] is now on-premises and in the cloud, with lots of fit-for-purpose solutions instead of a more monolithic approach."
Another reason for data disarray is the continued use of legacy systems, which often cant enforce modern data quality standards, Tello explained.
Having data, including ROT data, in so many different places not only results in chaos but can devour your budget.
"If you're storing data on-premises and in the cloud, you're paying not only for storage in more than one place but also for making sure that data is backed up and replicated," said Rags Srinivasan, chief sustainability officer at Veritas Technologies.
According to a recent Virtana survey of IT leaders, 94% of IT leaders said cloud storage costs are rising. Fifty-four percent said storage spending is growing faster than overall cloud costs.
"When you sign up for a cloud service, the tendency is to keep adding to it, but if you're deliberate about it, you can keep costs in line by keeping data better managed," Srinivasan noted.
Additionally, a habit of keeping old data around can become a regulatory compliance hazard. For example, financial services companies must keep transactional history only as long as legally required, and most have strict policies to purge that data after that timeframe is up.
"They definitely don't want an audit to come through and identify things they did badly in the past," said Andy Pernsteiner, field CTO at Vast Data. Keeping data longer than necessary can result in more liabilities, and companies don't want to take the chance of being held liable for holding onto an asset beyond the legal requirements.
ROT data can also take a toll on the environment. As companies look to become eco-friendly and sustainable, more want to reduce their carbon footprints. Data storage requires power and cooling, even if that data is stored in a cloud provider's data center. A 20TB hard drive, for example, uses about 14 watts of power, according to Srinivasan, and it will cost at least as much to cool the drive as it does to power it because of the heat it generates.
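To put rough numbers on that, here is a minimal back-of-envelope sketch. The 14-watt figure comes from the article; the electricity price, cooling overhead, and fleet size are illustrative assumptions.

```python
# Back-of-envelope power cost for ROT data sitting on spinning disks.
# The 14 W per 20 TB drive figure comes from the article; the
# $0.12/kWh rate, 1x cooling overhead, and 100-drive fleet are
# hypothetical assumptions.
DRIVE_WATTS = 14.0        # power draw of one 20 TB hard drive
COOLING_OVERHEAD = 1.0    # cooling costs "at least as much" as power
PRICE_PER_KWH = 0.12      # assumed electricity price in USD
HOURS_PER_YEAR = 24 * 365
DRIVE_COUNT = 100         # hypothetical fleet size

total_watts = DRIVE_COUNT * DRIVE_WATTS * (1 + COOLING_OVERHEAD)
kwh_per_year = total_watts / 1000 * HOURS_PER_YEAR
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:,.0f} kWh/year -> ${cost_per_year:,.2f}/year")
# ~24,528 kWh/year -> ~$2,943/year just to keep 100 drives of ROT spinning
```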
So how can companies ensure they manage data properly and retain only the data they need? These four tips will make a significant impact.
First, find out where the data is stored. That may be easier said than done since data can exist in many different locations, especially for companies that have existed for years. Structured data can reside in databases, data warehouses, data marts, and data lakes, while unstructured data can reside in file systems and object stores. Then there are physical devices, like user laptops, smartphones, and USB drives.
Once you identify the data locations, move the data to a data catalog, which helps identify and eliminate duplicate and irrelevant data. Data catalogs can also fix data that has errors so that it can become more usable.
Comprehensive data catalogs should have automated data lineage creation, data profiles, and policies for granular control and governance. The data profiling function examines, analyzes, and develops summaries of all data.
With a data catalog in place, decision-makers can gain visibility into which data is necessary and useful and which isn't. Organizations can also gain valuable insight into their incomplete data. For example, if you see that 10% of customer data lacks zip codes and another 5% lacks email addresses, you can then work to fill in those gaps.
In addition, data catalogs can show data attributes in new ways. For example, a data analyst could ask for all files or objects created by a specific department within a 2-year period, then filter that data to the files that have never been accessed. With these tools and processes, companies can more easily and safely move stale data offline or delete it altogether.
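A catalog query of the kind described might look like the following sketch. The records, field names, and thresholds are hypothetical stand-ins for whatever metadata your catalog actually exposes.

```python
from datetime import datetime, timedelta

# Hypothetical catalog records; a real data catalog would expose
# similar metadata (owner, created, last_accessed) through its API.
catalog = [
    {"path": "s3://finance/q1.csv", "department": "finance",
     "created": datetime(2022, 3, 1), "last_accessed": None},
    {"path": "s3://finance/q2.csv", "department": "finance",
     "created": datetime(2023, 1, 5), "last_accessed": datetime(2023, 4, 1)},
]

cutoff = datetime.now() - timedelta(days=2 * 365)
stale = [
    rec for rec in catalog
    if rec["department"] == "finance"   # files from one department...
    and rec["created"] >= cutoff        # ...created within a two-year window...
    and rec["last_accessed"] is None    # ...that have never been accessed
]
for rec in stale:
    print("candidate for offline tier or deletion:", rec["path"])
```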
"It's not enough to know that you have 500 petabytes of data sitting on all your systems," Pernsteiner said. "You have to know how much of it is old, has GDPR-related information, or is data the organization doesn't care about anymore."
Knowing what you have, organizing it, and implementing policies will help reduce data bloat and ensure compliance. However, failing to maintain your data management practices can land you right back where you started (or at least close to it).
As such, revisit your data policies regularly and revise them as necessary. For example, if a company has a hard-and-fast policy that requires purging data that hasnt been accessed or viewed after six months, that policy in time may become unrealistic or extreme. The company may decide to use a more dynamic policy that is based on current business needs.
"If IT can prove that certain data has value to the business, there may be value in keeping that data longer, and that might be a good reason to change the policies," Pernsteiner said.
IDrive Backup adds Unlimited Cloud to Cloud Backup, enabling … – StreetInsider.com
LOS ANGELES, May 4, 2023 /PRNewswire/ -- IDrive Backup, an industry-leading cloud backup service, has announced the addition of unlimited cloud to cloud backup, giving users the opportunity to back up and restore all of their Microsoft Office 365 and Google Workspace data.
For organizations and personal users that rely heavily on those cloud services, it's extremely important to have a second copy of their data stored elsewhere with an established and reliable backup company such as IDrive, as no service is immune to system failure and restores can then become very difficult.
Cloud to cloud backup with the award-winning IDrive allows users to back up their data from one cloud service to the IDrive cloud, making it an ideal solution for those who want to protect their data from potential loss due to service outages or other issues. This provides an extra layer of protection for important data by backing it up to an entirely separate cloud, keeping that data secure, recoverable, and accessible.
Backing up Microsoft Office 365 data with IDrive allows users to back up their entire Microsoft Office suite, including OneDrive, SharePoint, Exchange, and Teams data, unlike other backup services that protect only a limited amount of this data. By using this single backup solution approach, users can eliminate the complexity and costs associated with maintaining multiple separate backup agents.
For Google Workspace data, including Google Drive, Gmail, Calendar, and Contacts, using IDrive to back up this data can be critical to help safeguard it against accidental deletion, ransomware, or other outages or issues.
IDrive's intuitive data retention system with automatic snapshot schedules ensures safe retention of multiple versions of the backed up data, with 3 automatic daily backups that are retained for 30 days and enabling users to perform point-in-time restores.
Users also have comprehensive control over their backups through a centralized web console, helping them migrate, export, download, and perform cross-user restores with ease.
IDrive Cloud to Cloud Backup is an available add-on for all plans including Mini, Personal, Business, Team, and Enterprise for $20/seat/year for unlimited storage. IDrive Cloud to Cloud backup is also available as a stand-alone plan for the same cost.
About IDrive
IDrive Inc. is a privately held company specializing in cloud storage, online backup, file sharing, remote access, compliance and related technologies. Core services include IDrive, IDrive e2, RemotePC and IBackup. The company's services help over 4 million customers back up over 500 petabytes of data.
SOURCE IDrive Inc.
Apple results show Mac sales hit by economic slowdown – ComputerWeekly.com
Apple has posted quarterly revenue of $94.8bn, down 3% compared with last year.
Yet the company's services business posted an all-time record revenue of $20.9bn, according to CEO Tim Cook, and the March quarter set a record for iPhone, despite the challenging macroeconomic environment.
"We continue to invest for the long term and lead with our values, including making major progress toward building carbon-neutral products and supply chains by 2030," said Cook.
While the iPhone saw a 2% increase in revenue to $51.3bn, the company's iPad tablet experienced a decline in revenue of 13%, to $6.7bn. Its wearables business reported revenue of $8.8bn.
Although Apple remains consumer-focused, according to the transcript of the earnings call posted on Seeking Alpha, Cook has ambitions to grow its enterprise business. He said the company has focused primarily on the bring-your-own-device (BYOD) market, and as more companies offer this choice of device, Apple benefits. "I think a lot of people want to use a Mac or an iPad at work," he said.
Chief financial officer Luca Maestri stated that Mac revenue for the quarter was $7.2bn, a 31% decline compared with the previous year. "These results were driven by the challenging macroeconomic environment, coupled with a difficult comparison against last year's launch of the completely reimagined M1 MacBook Pro," said Maestri. "Despite this, the installed base of active Macs reached an all-time high across all geographic segments and we continue to see strong upgrade activity to Apple silicon."
Apple is not alone in experiencing declining PC sales. According to analyst IDC, weak demand, excess inventory and a worsening macroeconomic climate were all contributing factors for a precipitous drop in shipments of traditional PCs during the first quarter of 2023.
Of the top five manufacturers of personal computers, Apple is fourth with 7.2% market share, behind Dell (16.7%), HP (12%) and Lenovo (12.7%). Its nearest rival in the PC market is Asus, according to IDC, which had a market share of 6.7% in the first quarter of 2023.
Looking beyond devices, Apple executives were asked if they had any plans relating to artificial intelligence (AI). Cook was not drawn on the company's product roadmap, but said: "We've obviously made enormous progress integrating AI and machine learning throughout our ecosystem."
When asked about Apple services, Maestri said the company had experienced strong adoption of its payments service, adding: "Our growth rates are very strong as the adoption of Apple Pay and Apple Card continues to increase."
Cloud storage is an area that has continued to grow very consistently, according to Maestri. "Users want to store more photos and videos and more content on their devices, and so they adopt our cloud services," he said. "In general, the model in the App Store around paid subscriptions continues to grow very strongly. We now have more than 975 million paid subscriptions on the platform, and that's almost twice as much as we had only three years ago. So obviously, the growth in subscriptions is very strong."
2023 Stock Market Gains Are Tech Driven: Take Profits Now and Move to Legacy Dividend Tech Stocks – 24/7 Wall St.
So far, this year has been a big improvement over 2022, and investors can thank big technology stocks for the huge turnaround. Up 16.30% through Tuesday, the Nasdaq has trounced the 1.65% gain of the Dow Jones industrials and the S&P 500's 7.73% move higher.
The problem is that only a handful of top technology leaders have driven all of the gains so far, and that could be dangerous, as most are at or close to 52-week or all-time highs. While the move higher has been solid, the reality is that Apple and Microsoft have generated nearly 50% of the S&P 500 gains so far this year. If you add in the rest of the FAANG stocks, Facebook (Meta), Amazon, Netflix, Nvidia and Google (Alphabet), the group has accounted for a stunning 94% of the index gains through April.
One good idea for growth investors looking to stay in the technology sector might be to take profits on the high-flyers and look at some old-school legacy technology companies that may be able to pick up the rally torch in May. We screened our 24/7 Wall St. technology universe searching for industry leaders that may offer solid upside potential and reliable dividends. That combination could provide some powerful total return potential for the rest of 2023, and seven top stocks made the cut.
It is important to remember that no single analyst report should be used as a sole basis for any buying or selling decision.
Investors who are more conservative may want to consider this mega-cap tech leader, which recently posted outstanding quarterly results. Cisco Systems Inc. (NASDAQ: CSCO) designs, manufactures and sells internet protocol (IP) based networking products and services related to the communications and information technology industry worldwide.
Cisco provides switching products, including fixed-configuration and modular switches, and storage products that provide connectivity to end users, workstations, IP phones, wireless access points and servers, as well as next-generation network routing products that interconnect public and private wireline and mobile networks for mobile, data, voice and video applications.
Its cybersecurity products give clients the scope, scale and capabilities to keep up with the complexity and volume of threats. Putting security above everything helps corporations innovate while keeping their assets safe.
The networking giant posted solid fiscal first-quarter results that beat earnings expectations, and it offered up strong forward guidance.
Shareholders receive a 3.30% dividend. Credit Suisse has an Outperform rating on Cisco Systems stock. The firm's $69 target price compares with a $56.72 consensus target and Wednesday's close at $45.96.
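As a back-of-envelope illustration of the "total return potential" framing used throughout this piece, the implied upside to an analyst target can be combined with the dividend yield. The figures below are the article's Cisco numbers; this is simple arithmetic, not a forecast or investment advice.

```python
# Implied one-year total return if the stock reached the analyst target:
# price appreciation plus dividend yield. Figures are the article's
# Cisco example; this is arithmetic, not a prediction.
close = 45.96            # Wednesday's close
target = 69.00           # Credit Suisse price target
dividend_yield = 0.033   # 3.30% dividend

upside = (target - close) / close        # ~50.1% price appreciation
total_return = upside + dividend_yield   # ~53.4% implied total return
print(f"upside {upside:.1%}, implied total return {total_return:.1%}")
```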
This company continues to be a huge player in the fiber optic world. Corning Inc. (NYSE: GLW) is a technology pioneer that manufactures LCD glass for flat-panel displays for multiple product lines.
Its Display Technologies segment offers glass substrates for flat panel displays, including liquid crystal displays and organic light-emitting diodes that are used in televisions, notebook computers, desktop monitors, tablets and handheld devices. Corning's Optical Communications segment provides optical fibers and cables, as well as hardware and equipment products, such as cable assemblies, fiber optic hardware and connectors, optical components and couplers, closures, network interface devices and other accessories for the telecommunications industry, businesses, governments and individuals.
The Specialty Materials segment manufactures products that offer material formulations for glass, glass ceramics, crystals, precision metrology instruments and software, as well as glass wafers and substrates, tinted sunglasses and radiation shielding products for various markets comprising mobile consumer electronics, semiconductor equipment optics and consumables, aerospace and defense optics, radiation shielding products, sunglasses and telecommunications components.
Its Environmental Technologies segment provides ceramic substrates and filter products for emissions control in mobile, gasoline and diesel applications. And the company's Life Sciences segment offers laboratory products, including consumables, such as plastic vessels, liquid handling plastics, specialty surfaces, cell culture media and serum, as well as general labware, glassware and equipment under the Corning, Falcon, Pyrex and Axygen brands.
Corning stock comes with a 3.45% dividend. J.P. Morgan's Overweight rating is accompanied by a $42 price target. The consensus target is $37.17, and shares closed on Wednesday at $31.67.
This high-quality company pays a solid dividend, but its shares have been hit hard. Dell Technologies Inc. (NYSE: DELL) designs, develops, manufactures, markets, sells and supports information technology (IT) hardware, software and services solutions worldwide. It operates through three segments.
Infrastructure Solutions Group provides traditional and next-generation storage solutions, and rack, blade, tower and hyperscale servers. It also offers networking products and services that help its business customers to transform and modernize their infrastructure, mobilize and enrich end-user experiences and accelerate business applications and processes. It also offers attached software and peripherals, as well as support and deployment, configuration and extended warranty services.
The Client Solutions Group offers desktops, notebooks and workstations; displays and projectors; attached and third-party software and peripherals; as well as support and deployment, configuration and extended warranty services.
The VMware segment supports customers in the areas of hybrid and multi-cloud, modern applications, networking, security and digital workspaces, helping customers to manage IT resources across private clouds and complex multi-cloud, multi-device environments.
Dell also provides information security and cloud software and infrastructure-as-a-service solutions that enable customers to migrate, run, and manage mission-critical applications in cloud-based IT environments.
Investors receive a 3.38% dividend. Morgan Stanley's $55 price target comes with an Overweight rating. Dell Technologies has a consensus target of $47.86. Wednesday's close was at $45.16.
IBM
This blue chip giant still offers investors who are more conservative an incredibly solid entry point, a massive dividend and a degree of safety. International Business Machines Corp. (NYSE: IBM) provides integrated solutions and services worldwide through these four business segments.
The Software segment offers hybrid cloud platform and software solutions, such as Red Hat, an enterprise open-source solution; software for business automation, AIOps and management, integration, and application servers; data and artificial intelligence solutions; and security software and services for threat, data and identity. This segment also provides transaction processing software that supports clients' mission-critical and on-premises workloads in the banking, airline and retail industries.
The Consulting segment offers business transformation services, including strategy, business process design and operations, data and analytics, and system integration services; technology consulting services; and application and cloud platform services.
The Infrastructure segment provides on-premises and cloud-based server and storage solutions for its clients' mission-critical and regulated workloads; and support services and solutions for hybrid cloud infrastructure, as well as remanufacturing and remarketing services for used equipment.
The Financing segment offers lease, installment payment, loan financing and short-term working capital financing services.
The dividend yield here is 5.31%. The $162 Credit Suisse price target on the Outperform-rated stock is a Wall Street high. The $141.47 consensus target is closer to Wednesday's closing print of $123.45.
This is another familiar name that could offer among the best in total return potential. Juniper Networks Inc. (NYSE: JNPR) designs, develops and sells network products and services worldwide. The company offers various routing products, such as ACX series universal access routers to deploy new high-bandwidth services; MX series Ethernet routers that function as a universal edge platform; PTX series packet transport routers; and NorthStar controllers.
Juniper Networks also provides switching products, including EX series Ethernet switches to address the access, aggregation and core layer switching requirements of micro branch, branch office, and campus environments; QFX series of core, spine and top-of-rack data center switches; and Juniper access points, which provide wireless access and performance.
In addition, the company offers security products, including SRX series services gateways for the data center; the Branch SRX family, which provides an integrated firewall and next-generation firewall; a virtual firewall that delivers various features of physical firewalls; and advanced malware protection, a cloud-based service, and Juniper ATP.
Juniper Networks stock investors receive a 3.02% dividend. J.P. Morgan has an Overweight rating with a $42 price target. The consensus target is $36.31. The shares closed at $28.94 on Wednesday.
Seagate
This disk drive giant looks reasonable at current trading levels. Seagate Technology Holdings PLC (NASDAQ: STX) provides data storage technology and solutions in Singapore, the United States, the Netherlands and elsewhere.
The company offers hard disk and solid-state drives, including serial advanced technology attachment, serial attached SCSI and non-volatile memory express products; solid-state hybrid drives; and storage subsystems. Its products are used in enterprise servers and storage systems and edge compute and non-compute applications.
Seagate also provides an enterprise data solutions portfolio, comprising storage subsystems and mass capacity optimized private cloud storage solutions for enterprises, cloud service providers and scale-out storage servers and original equipment manufacturers. In addition, it offers external storage solutions under the Seagate Backup Plus and Expansion product lines, as well as under the LaCie and Maxtor brands in capacities up to 16 terabytes.
Shareholders enjoy a 4.78% dividend. TD Cowen has an Outperform rating and has set its price objective at $70. The consensus target is $62.50, and Seagate Technology stock closed on Wednesday at $58.36.
This old-school semiconductor company offers solid value at current levels and is a great pick for investors who are more conservative. Texas Instruments Inc. (NASDAQ: TXN) is a broad-based supplier of semiconductor components, ranging from digital signal processors to high-performance analog components, to digital light-processing technology and calculators.
Some 65% of the company's sales are exposed to the well-diversified business-to-business industrial, automotive, communications infrastructure and enterprise markets. While business from those sectors, especially automotive, could suffer in the near term, analysts feel the solid dividend should support the shares.
The company is also a big Apple supplier, so the long-term outlook for this venerable leader makes it a safer bet for investors with less risk tolerance.
Investors receive a 2.98% dividend. The analysts at Oppenheimer have an Outperform rating for Texas Instruments stock. Their $195 price target is higher than the $181.46 consensus target and the most recent close at $163.35.
Gartner: Begun the cloud file services wars have – Blocks and Files
A report put out at the end of February, titled "Modernize Your File Storage and Data Services for the Hybrid Cloud Future" by Gartner research VP Julia Palmer, pulls a lot into focus about the cloud file services marketing war that B&F has written about in the past.
Palmer starts from more or less the same points: unstructured (file) data is growing, and organizations are using the scalability of the public cloud for several things, including bursting compute capacity, bursting storage, archival storage, remote worker access and so forth.
There are three groups of suppliers, she says:
A table lists representative suppliers in each category:
Next-generation file platforms have characteristics such as being designed to integrate and leverage public cloud IaaS. Vendors like NetApp and Qumulo have become software-defined and ported their filer operating systems to the public cloud, providing a file system and services abstraction layer covering the on-premises and public cloud worlds. VAST Data is heading the same way. Weka is there already.
At the other end of this spectrum (hybrid cloud file data services sit in the middle) are the hybrid cloud file platforms. We have covered CTERA, Nasuni and Panzura often, also LucidLink and to an extent Vcinity, but have not really included Zadara in this category. Nor have we looked at the public cloud suppliers themselves, or Huawei or Oracle, as being in this category.
The mid-spectrum players provide file data services, such as migration (Atempo, Datadobi, Data Dynamics, Peer Software and WANdisco, for example). Komprise also has a migration offering but is better known for its file lifecycle management capabilities, as is Nodeum. Hammerspace is pioneering its data orchestration technology.
Palmer really nails it with this tripartite viewpoint, which is a neat way to group vendors, compare them in the three categories, and map their capabilities to an organization's needs and direction.
It leaves us asking if the mid-spectrum hybrid cloud file data services players face having their capabilities provided as features by the next-generation file platforms on the one hand and the hybrid cloud file platforms on the other.
The Gartner paper doesn't look at object storage, which we think may be a limiting factor, especially as combined file and object offerings are becoming common. We can view S3 as AWS's hybrid cloud object platform, and also look at Pure Storage, VAST Data and others as offering next-generation file+object platforms, while Cloudian and Scality layer file access on their object storage products, which already have public cloud support.
IDrive gets even better with unlimited backups of your Google … – PCWorld
IDrive is trying something a little different in cloud backups: backing up your entire Google Workspace or Microsoft 365 storage for only $20 per year per computer.
Confused? Then let's, er, back up. IDrive is one of our most highly recommended cloud backup options, able to back up your PC to the cloud with either a free plan or paid options that go all the way to 20TB. It's basically one of the only good independent cloud storage/backup providers that isn't a major corporation like Google or Microsoft.
Due to be formally announced Thursday, IDrive will now offer unlimited cloud backup for both Google Workspace (Google Drive, Gmail, Calendar, and Contacts) and Office 365 (OneDrive, Exchange, SharePoint, and Teams), called Unlimited Cloud to Cloud Backup. What IDrive will do is back up all of your files in the Google or Microsoft cloud over to IDrive as well. And if you pay for extra storage in either service, IDrive can accommodate all of those files.
Why do this? It's an additional layer of security for those who worry about cloud services going down and data becoming inaccessible. It sounds a little far-fetched until you remember that Western Digital's cloud service was down for over a week. That's why the rule of three exists: you should have your data stored in three places, just in case: a local copy and two backups. That's what IDrive is offering.
IDrive's Unlimited Cloud to Cloud Backup backs up data from the other clouds three times per day. It retains snapshots of previous versions, allowing you to restore your data from a given point in time and not just all of it at once. You can seek out and restore individual files and folders, too. IDrive considers this service to be an add-on, meaning that you can attach it to existing IDrive plans. (You can also buy Unlimited Cloud to Cloud Backup as a standalone service.)
IDrive seems to be pitching this as a solution for individuals, families or small businesses, since a demonstration video showed a backup of a Microsoft cloud with about twenty different users. The only catch, if there is one, is that IDrive charges the $20 annual fee (or roughly $1.66 per month) per computer for backing up one service, Google Drive or Microsoft Office 365.
Quantum computing: What are the data storage challenges? – ComputerWeekly.com
Quantum computing will process massive amounts of information. Workloads could include diagnostic simulations and analysis at speeds far greater than existing computing. But, to be fully effective, quantum computing will need to access, analyse and store huge amounts of data.
There is an expectation that quantum computing will be the next step in the evolution of IT systems. Just as the multicore processor allowed computers to perform multiple tasks in parallel, quantum processors will be a leap forward in compute power and allow performance of complex tasks in a fraction of the time required now.
Quantum computers, as the name implies, use quantum mechanics (the branch of physics concerned with atomic and subatomic particles) to overcome the limitations of existing computer systems.
The principles of the superposition of states and quantum entanglement enable a different computation method from that used currently. A quantum computer can potentially store more states per unit of information (called quantum bits, or qubits) and operate with much more efficient algorithms at the numerical level.
Qubits are a two-state quantum-mechanical system. However, because of superposition, they can also be in both of the two states, 1 and 0, at the same time. In a classic computer system, a bit would have to be in one state or the other: 1 or 0. Quantum mechanics allows a qubit to be in a coherent superposition of both states simultaneously, a property that is fundamental to quantum mechanics and therefore to quantum computing.
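In standard notation (a sketch of textbook material, not drawn from the article), a qubit's state is a weighted combination of the two basis states, and measurement returns 0 or 1 with probabilities given by the squared amplitude magnitudes:

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

Measuring yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$, which is why, as Weides notes below, a quantum computation is typically repeated many times to build up a statistical answer.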
Ultimately, this will allow quantum computers to process complex tasks using large datasets far more quickly than a classic computer, especially in the realms of big data and pattern recognition. For example, quantum computers have potential applications in the pharmaceutical industry, to screen larger and more complex molecules than they were previously able to, and to map the complex interactions between a pharmaceutical product and its intended target.
"At the core of the quantum computer's potential for achieving exponentially greater computational power lies the qubit's capacity to exist in a state of superposition," explains Martin Weides, professor of quantum technologies at Glasgow University. "It gives you a statistical answer of likelihoods and then you repeat the calculation a number of times, and amplify that result. At the end, you get some outcome, but it's not with 100% certainty."
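That repeat-and-amplify workflow is visible in even the smallest quantum program. Below is a minimal sketch assuming the open-source Qiskit library and its Aer simulator are installed (these are illustrative tools, not anything referenced in the article): a single qubit is placed in an equal superposition and measured over many shots.

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# One qubit: a Hadamard gate creates an equal superposition of |0> and |1>,
# then measurement collapses it. A single shot returns 0 or 1 at random;
# only repeating the run many times reveals the underlying ~50/50 statistics.
qc = QuantumCircuit(1)
qc.h(0)
qc.measure_all()

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1000).result().get_counts()
print(counts)  # e.g. {'0': 507, '1': 493}
```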
One of the core challenges of quantum computers is that their storage systems are unsuitable for long-term storage due to quantum decoherence, the effect of which can build up over time. Decoherence occurs when quantum computing data is brought into existing data storage frameworks and causes qubits to lose their quantum status, resulting in corrupted data and data loss.
"Quantum mechanical bits can't be stored for long times as they tend to decay and collapse after a while," says Weides. "Depending on the technology used, they can collapse within seconds, but the best ones last about a minute. You don't really achieve 10 years of storage. Maybe in two decades we might get there, but it's not required either."
Quantum computers will need data storage during computation, but that needs to be a quantum memory for storing super-positioned or entangled states, and storage durations are going to present a challenge.
So, its likely data storage for quantum computing will need to rely on conventional storage, such as in high-performance computing (HPC).
Considering the massive financial investment required for quantum computing, introducing cheap data storage elements as a cost-saving exercise would be a counter-productive limitation.
Given the data storage challenges and the requirement to process large datasets, quantum computing is likely to be best accessed through the cloud. IBM's current quantum systems are cloud-connected, for example. Naturally, the effectiveness of cloud storage is dependent on network connectivity to the cloud.
Although quantum computing faces challenges in scalability and decoherence, its ability to perform multiple simultaneous operations in a fraction of the time it would take conventional processors means it is likely to become a powerful tool for analytics workloads.
Quantum computing and quantum storage are unlikely to replace existing compute and storage systems.
Using classical compute and storage infrastructure will remain the easiest and most economical solution for tackling everyday problems, especially those that involve small, simple, repeatable tasks.
That said, quantum computing promises to deliver incredible advances in fields that include materials science, climate research and pharmaceutical development. Organisations are already experimenting with quantum computing to develop lighter and more powerful batteries for electric cars, and to help create new medicines.
The limited storage capabilities associated with quantum computers mean they will continue to be dependent on classical storage systems for data extraction and information output. However, these would have to be capable of handling large datasets. Some of today's high-end storage systems, especially those that are cloud-based, should be more than adequate for the task.
"A quantum computer, being so expensive, would almost certainly be operated in a dedicated facility with lots of new hardware, including storage," concludes Weides.
Cloudy with a Chance of Bad Logs: Cloud Platform Log … – Mandiant
More and more organizations utilize cloud technology for applications, file storage, and more. However, if an attacker compromises a cloud environment, organizations may not know how to investigate those technologies, or may not even be logging the evidence that could allow the organization to identify what an attacker did.
This blog post describes a hypothetical scenario of a cloud platform compromise with multiple components that would require investigation. Each component is an example of a real intrusion tactic that Mandiant has investigated across various cloud platforms, sometimes with logs available and sometimes without logs available.
For each part of the compromise, we provide recommended logging configurations and investigation processes organized into cloud technology themes that group cloud services from Google Cloud Platform (GCP), Amazon Web Services (AWS), and Microsoft Azure together:
After reading through this scenario, you should be able to:
While we review many concepts, there are some limitations to be aware of in the scope of this post:
The attacker gained access to the Cloud Email platform through a credential stuffing attack against a cloud administrator account. Once the attacker found a valid password, the attacker authenticated with those credentials and the Cloud Email platform asked them which type of multi-factor authentication (MFA) process they preferred. The attacker chose the push option, which sent an approval request to the legitimate user. The administrator user deals with push authentication requests throughout the day for various services and mistakenly accepted the authentication request, which provided initial access to the attacker.
Once the attacker identified the cloud administrator credentials and authenticated, they logged in to the Cloud Management Console to identify other applications that the user could access.
The attacker identified that the cloud administrator account had access to the Cloud Authentication Services application and authenticated to it. In the Cloud Authentication Services application, the attacker changed the privileges of the cloud administrator to the highest global administrator account privileges available and removed the multi-factor requirement.
While in the Cloud Management Console, the attacker identified that the organization uses a custom Cloud Application. The attacker accessed the Cloud Code Repository with the global administrator account and identified the Cloud Application source code hosted there. The attacker accessed the code and identified plain-text hard-coded credentials for an application service account.
While in the Cloud Authentication Services application, the attacker identified that the Administrator had access to the Cloud Logging platform. The attacker authenticated to the Cloud Logging platform and searched logs for keywords related to plain-text credentials. The attacker exported logs that contained those keywords, particularly database user credentials.
The attacker returned to the Cloud Authentication Services application and performed reconnaissance on systems and users. The attacker exported all environment objects, including systems and accounts.
Next, the attacker pivoted to the Cloud Virtual Machine infrastructure and created a templated virtual machine. The attacker assigned the virtual machine to the application service account previously identified in the application source code. The attacker configured the Cloud Networking rules to allow remote desktop protocol (RDP) access from the internet. The application service account did not require MFA for any authentication activity because of its intended use. The attacker logged on to the virtual machine through RDP from their command and control (C2) server.
While logged on to the newly created virtual machine, the attacker identified a database server based on the hostname SQLDB01. The attacker moved laterally from the virtual machine they created to the database server via RDP using the application service account.
The attacker connected to the database, which utilized a Cloud Database Service backend, using the database user credentials previously identified in logs and explored the data by enumerating the table schema and running select * queries.
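That access pattern (schema enumeration followed by broad reads) is exactly what investigators hunt for in database query logs, when they exist. A minimal sketch of such a hunt over exported log records follows; the record fields and log format are hypothetical, not any particular cloud provider's schema.

```python
import re

# Hunt for reconnaissance patterns in exported database query logs:
# schema enumeration (information_schema reads) followed by broad
# "select *" pulls. The log record fields here are hypothetical.
query_log = [
    {"user": "svc_app", "src_ip": "203.0.113.7",
     "query": "SELECT table_name FROM information_schema.tables"},
    {"user": "svc_app", "src_ip": "203.0.113.7",
     "query": "SELECT * FROM customers"},
]

recon = re.compile(r"information_schema|select\s+\*", re.IGNORECASE)
for rec in query_log:
    if recon.search(rec["query"]):
        print(f"review: {rec['user']} from {rec['src_ip']}: {rec['query']}")
```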
While logged on to the attacker-created virtual machine, the attacker also performed internal reconnaissance to identify other systems of interest. The attacker scanned the network for other systems using custom port scanning utilities that searched for open SSH, RDP, and SMB ports.
The attacker identified a network-shared file server that hosted files on a Cloud File Storage solution. After enumerating files stored on the network share, the attacker copied files to their C2 system using a bulk network file transfer utility.
While accessing the file server, the attacker also decided to stage further backdoors in trojanized files that are likely to be opened by users.
While logged on to cloud email for the administrator account, the attacker browsed through the last several days of messages. The attacker looked at email folders named "finance" and "hr" and downloaded attachments from sent messages.
The attacker shared the uploaded trojanized backdoor file through the collaboration platforms file sharing service with 20 users.
Several users messaged the administrator's account and asked questions about errors opening the new document they had downloaded through the collaboration platform's automated file-share email link. The attacker replied to tell the users the document was legitimate.
Finally, in an attempt to delay detection, the attacker created a mailbox rule to automatically delete replies to the compromised file share email.
The aforementioned hypothetical scenario took place over several days, reflecting how quickly the threat actors moved in the real scenarios this one is based on. In these cases, information security teams commonly have only a few medium-priority alerts fire, which go unnoticed due to the abundance of alerts feeding from their tools.
In this scenario, suspicion started when several helpdesk team members realized they had separate reports of users who had suspicious files shared with them. The helpdesk team escalated to Information Security per their documented processes and the Incident Response (IR) team started an investigation into the cloud file sharing platform associated with the file sharing.
The IR team quickly realized that the default logging available with their lowest cost license subscription recorded many useful logs such as:
Unfortunately, the investigation could not answer the question "did the attacker access any email messages or synchronize any mailboxes?" due to the default logging levels. The IR team also realized they were lucky the incident was detected relatively quickly, because the default license subscription only stored logs for 90 days with their Cloud Logging platform.
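Where richer audit logs are available, the mailbox question becomes answerable with a simple triage pass. The sketch below assumes a JSON export of unified audit events; the operation names are illustrative examples modeled on common cloud email platforms, and the field names will differ by provider.

```python
import json

# Triage exported cloud email audit events for the two questions the IR
# team could not answer: mailbox item access and suspicious rule creation.
# Operation and field names are illustrative; check your provider's
# actual audit schema before relying on any of them.
INTERESTING_OPS = {"MailItemsAccessed", "New-InboxRule", "MailboxSync"}

with open("audit_export.json") as fh:  # hypothetical export file
    events = json.load(fh)

for ev in events:
    if ev.get("Operation") in INTERESTING_OPS:
        print(ev.get("CreationTime"), ev.get("UserId"), ev.get("Operation"))
```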
After a post-mortem review several months later, the organization realized the IR team had only reviewed collaboration platform authentications and had not cross-referenced against domain authentication logs. This meant that the internal team never identified that the attacker compromised the cloud infrastructure platform and performed follow-on activities such as creating and accessing a VM, elevating to domain administrator privileges, and interacting with file servers. They focused only on the collaboration platform because the initial incident identification occurred after the sharing of files on the Collaboration Cloud File Sharing platform. The investigation had to be reopened several months later, when evidence had started to disappear from Cloud Logging sources.
As the scenario demonstrates, attackers have a wider surface area to persist and steal data because of the adoption of cloud infrastructure and collaboration platforms. The move to these cloud platforms brings useful functionality and security features, but configuring everything correctly can be overwhelming for a team that is new to the technology.
Not only are there many access, permission, and protection configurations to consider, but teams should also make sure that they would be able to fully investigate various attacks that could happen by storing the correct logs.
Understanding what technologies your organization uses and performing threat modeling is one way to make sure you have these logs and investigative processes set up should you need to investigate.
For details on how Mandiant can assist with your cloud security, please check out the following resources:
The following attack path diagram visualizes how the actor accessed a wide range of cloud platforms from outside a standard perimeter in this scenario. The actor also used cloud technologies to interact with systems in the non-cloud environment as well through connections and integrations.
The following checklist is designed to be copied or printed for your cloud infrastructure logging review efforts. The provided logs are example categories of commonly utilized event logs for forensic investigations.
| Reference Number | Technology | Log Type |
| --- | --- | --- |
| 1.1.1 | Cloud Virtual Machines | Configure system event logs to follow standard endpoint logging policies for authentication, user activity, and privileged account use. |
| 1.1.2 | Cloud Virtual Machines | Log virtual machine management actions such as start, pause, backup, snapshot, create, delete, and command executions. |
| 1.1.3 | Cloud Virtual Machines | Forward system logs to a log management platform or SIEM as part of standard policies and processes. |
| 1.2.1 | Cloud Applications, Containers, and Functions | Log web server access to the application, including source IP address, protocol used, request parameters, response status, user agent, referrer, and response size. Ensure that the source IP address is not overwritten by proxy or load balancer technology. |
| 1.2.2 | Cloud Applications, Containers, and Functions | Log creation, modification, and access to application code. |
| 1.2.3 | Cloud Applications, Containers, and Functions | Record successful and failed authentication activity, including source IP address. |
| 1.2.4 | Cloud Applications, Containers, and Functions | Log application user activity, including user account, information viewed, actions performed, and sensitive data accessed. |
| 1.2.5 | Cloud Applications, Containers, and Functions | Forward system logs to a log management platform or SIEM as part of standard policies and processes. |
| 1.3.1 | Cloud Database Services | Log database user authentication and source network address. |
| 1.3.2 | Cloud Database Services | Log data access, including source network address and user. |
| 1.3.3 | Cloud Database Services | Log data modification and deletion, including source network address and user. |
| 1.3.4 | Cloud Database Services | Forward system logs to a log management platform or SIEM as part of standard policies and processes. |
| 1.3.5 | Cloud Database Services | Log errors and long-running queries, which could be indicative of data transfer or reconnaissance. |
| 1.4.1 | Cloud File Storage | Log user authentication. |
| 1.4.2 | Cloud File Storage | Log file creation, modification, upload, and deletion events with user account, IP address, and timestamp. |
| 1.4.3 | Cloud File Storage | Log file download events with user account, source IP address, and timestamp. |
| 1.4.4 | Cloud File Storage | Log location, folder, and file permission changes. |
| 1.4.5 | Cloud File Storage | Log API access to file storage locations, folders, and files. |
| 1.4.6 | Cloud File Storage | Log file and directory listing metadata views. |
| 1.4.7 | Cloud File Storage | Turn on alerts for suspicious activity, including malware and mass downloads, if available. |
| 1.5.1 | Cloud Authentication Services | Log user authentication with timestamp, username, and source IP address. |
| 1.5.2 | Cloud Authentication Services | |
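To make the checklist concrete, here is a hypothetical normalized event for item 1.4.3 (file download with user account, source IP address, and timestamp) as it might land in a SIEM; the field names are an assumed normalization, not any vendor's native schema.

```python
import json

# A normalized cloud file storage download event (checklist item 1.4.3)
# as it might be forwarded to a SIEM. The field names are a hypothetical
# normalization, not any vendor's native log schema.
event = {
    "timestamp": "2023-05-04T14:31:22Z",
    "platform": "cloud_file_storage",
    "action": "file_download",
    "user_account": "jdoe@example.com",
    "source_ip": "198.51.100.23",
    "object_path": "/finance/payroll.xlsx",
    "bytes": 482301,
}
print(json.dumps(event, indent=2))
```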
Google to build two new data centers in Ohio in addition to an … – The Statehouse News Bureau
Ohio weather might be cloudy much of the year but Google is going to put even more clouds here.
The tech giant is bringing two more data centers to Ohio to handle cloud storage, artificial intelligence and more. And these data centers will be in addition to the one that's already located near the future site of Intel in central Ohio.
Google Vice President of Government and Public Policy Mark Isakowitz said the company's existing data center near Columbus has generated $13.9 billion of economic activity for tens of thousands of businesses. And now, he said, the company will build two more: one in Lancaster and the other in Columbus.
"We have data centers in 13 states and nine countries and we are going to have three operational data centers right here in central Ohio, so that's really exciting," Isakowitz said.
Isakowitz, an Ohio native, said it's unusual to have three data centers located so close together. But he said Ohio has the water, land and natural resources the company needs. And he said Google likes the local business partners here in Ohio.
"So you feel you can build out a workforce, have enough space and room to develop the data centers. It's a combination of policies, people, natural resources that creates a good ecosystem," Isakowitz said.
The new data centers will bring the company's investment in Ohio to more than $2 billion, all in the central Ohio region. Democratic Columbus Mayor Andrew Ginther said the area has the fastest growing economy in the Midwest.
"Google is playing a big part in making the region a hub for cloud computing and priming the region for even more development. The future is very bright in Columbus and Central Ohio," Ginther said.
Google isn't saying how many new jobs will be created. And it won't divulge other terms of the agreement. Republican Gov. Mike DeWine isn't divulging that information either. But he said more of this type of development will happen if the state invests in job training and education.
"We have to keep bringing these companies into the state of Ohio and the best way to do that, frankly, is to show them that Ohio is dead serious about investing in our people," DeWine said.
DeWine's proposed two-year state budget included millions of dollars for K-12 education. He had proposed $300 million for tech center improvements alone. He's been urging lawmakers to keep that money in there as the budget goes through the legislative process.
Ohio has been the site of big investments by tech companies recently. Last year, Intel announced it would be building a $20 billion computer chip factory in New Albany, east of Columbus. And Honda and LG Energy Solution have teamed up to build a $3.5 billion battery plant in Jeffersonville, about halfway between Columbus and Cincinnati.