Category Archives: Cloud Servers

Tencent Cloud and AMD Join Forces to Launch StarLake Servers in Southeast Asia – HPCwire

Singapore, Nov. 29, 2021 – Tencent Cloud, the cloud business of global technology company Tencent, today announced its collaboration with leading semiconductor company AMD to introduce StarLake servers powered by AMD EPYC processors for hyperscale data centers in Southeast Asia. The collaboration further highlights Tencent Cloud's commitment to providing safe, stable and high-performance infrastructure products and services for the cloud era.

StarLake, released in 2019, is the first self-developed server from Tencent Cloud, and the effort was amped up by the establishment of the StarLake lab in 2020. The lab focuses on creating a diversified technology ecosystem and high-quality products, allowing it to become a source of new technologies for corporate IT requirements. With StarLake also used in dozens of Tencent's own business scenarios, Tencent Cloud has accumulated substantial experience in AMD platform adoption, with a large AMD platform deployment in China resulting in an overall performance boost and cost reduction.

The StarLake cloud servers were made available to offer better performance points for the virtual machine (VM) public cloud market, and are also designed for and well suited to running private cloud workloads. They also help save power via efficient heat radiation, with a thermal resistance 35% lower than that of general-purpose servers and a ratio of fan power consumption as low as 2.14%. Living up to Tencent Cloud's safe, secure, stable and high-quality standards, StarLake is packed with key features such as:

Boosting StarLake's power, AMD has provided its high-performance, cost-competitive and energy-saving AMD EPYC processors. The highly acclaimed processors are known for maximizing performance for memory-bound codes and increasing the density of VMs per rack, which can enable software savings.

Poshu Yeung, Senior Vice President, Tencent Cloud International, said, "As a demonstration of Tencent Cloud's dedication to always providing cloud products and services through highly compatible architecture as well as simple and reliable design, we are pleased to announce our technology collaboration with AMD, a multinational developer of computer processors and technologies. By working hand in hand, we can provide users with the StarLake server with higher performance, reasonable price and lower power consumption."

"We are pleased to be working with Tencent Cloud on the launch of their StarLake servers across Southeast Asia," said Peter Chambers, Managing Director, APAC for AMD. "Powered by AMD EPYC processors, Tencent Cloud helps deliver a modern cloud environment to end users with leading performance capabilities, impressive price for performance, high levels of efficiency and advanced security features."

About Tencent Cloud

Tencent Cloud is Tencent's cloud services brand, providing industry-leading cloud products and services to organizations and enterprises across the world. Leveraging its robust data center infrastructure around the world, Tencent integrates cloud computing, big data analytics, AI, Internet of Things, security and other advanced technologies with smart enterprise scenarios. At the same time, we provide holistic smart enterprise solutions for sectors including finance, education, healthcare, retail, industry, transport, energy and radio & television.

Source: Tencent Cloud

Visit link:
Tencent Cloud and AMD Join Forces to Launch StarLake Servers in Southeast Asia - HPCwire

A Climate Dystopia Displayed at the UMOCA with ‘the weight of a cloud’ – Daily Utah Chronicle

Walking into UMOCA's Exit Gallery, one wouldn't realize they are entering a sort of uncanny valley. The lush green landscapes and flowery imagery draw you in, but upon closer inspection, you see scenes of burning forests, nuclear towers and animal skulls. This eerie imagery has been referred to as the "environmental uncanny," bringing the haunting pieces from these artists to life.

Artist Nick Pedersen's "Slow Apocalypse" explores ideas of human-caused climate change through imagined, hyper-realistic digital landscapes. Each artwork examines the tension between the natural and human-made worlds, in densely packed scenes that blend the two together.

There is an almost sobering effect to looking at these artworks. A deep melancholy cuts through even the most vibrant colors and lushest greenery: sorrowful eyes stare out at you as you watch homes burn and trees being cut down.

Issues surrounding climate change often feel far from home, removed from our day-to-day. Local artist Tiana Birrell identifies how climate change affects Utah with their latest project, "the weight of a cloud."

I am sure by now almost everyone has used the cloud to store some part of their digital presence. Whether a Google Doc has autosaved your latest assignment, you've stored photos of a cool flower you saw outside or you've shot off an email to a professor, we've all stored something in the digital cloud.

While the term "cloud" seems light, ephemeral and airy, Birrell draws our attention to the very real and physical nature of the cloud through this installation. The work shows the stacks of data servers and miles of electrical wires that house the cloud, the things that tether it to the Earth despite what the name suggests.

In the Salt Lake Valley alone there are over 15 data centers, all requiring massive amounts of natural resources to operate. One center, where the NSA tethers its cloud, uses 1.7 million gallons of water daily to cool its servers. Birrell makes us question why these water-sucking panopticons make their homes in Utah, the second-driest state, during one of the worst droughts it has seen in decades.

I often find these kinds of exhibitions somber, and they can leave me feeling defeated, but something about these installations at the UMOCA is different. There is something I cannot put into words just yet, something activating about the shows. The direct engagement between the exhibit and the viewer has an almost energizing quality, leaving us with knowledge of the uncanny environment and of how we can impact our landscapes.



See the article here:
A Climate Dystopia Displayed at the UMOCA with 'the weight of a cloud' - Daily Utah Chronicle

AWS Announced General Availability of Elastic Disaster Recovery –

Recently AWS announced the general availability (GA) of AWS Elastic Disaster Recovery (AWS DRS). With this new service, organizations can minimize downtime and data loss through the fast, reliable recovery of on-premises and cloud-based applications.

AWS DRS is a scalable, cost-effective disaster recovery service for physical, virtual, and cloud servers. It is based on CloudEndure Disaster Recovery, and the company now positions it as the recommended service for disaster recovery to AWS.

Customers can use AWS as an elastic recovery site instead of investing in on-premises disaster recovery infrastructure that sits idle until needed. Once configured, DRS keeps the customer's operating systems, applications, and databases in a continual replication state. This enables them to achieve recovery point objectives (RPOs) of seconds and recovery time objectives (RTOs) of minutes after a disaster. DRS, for example, permits recovery to a previous point in time in the event of a ransomware attack.
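The RPO/RTO distinction the article relies on can be made concrete with a small sketch. The function names and timestamps below are illustrative only, not part of the AWS DRS API:

```python
from datetime import datetime, timedelta

# Hypothetical helpers for the two recovery metrics; not part of any AWS SDK.
def recovery_point(last_replicated_at: datetime, disaster_at: datetime) -> timedelta:
    """Data-loss window: how far behind the last replicated state is."""
    return disaster_at - last_replicated_at

def recovery_time(disaster_at: datetime, restored_at: datetime) -> timedelta:
    """Downtime window: how long until the service is running again."""
    return restored_at - disaster_at

disaster = datetime(2021, 12, 1, 12, 0, 0)
# Continual block-level replication keeps the lag to seconds...
rpo = recovery_point(disaster - timedelta(seconds=5), disaster)
# ...while orchestrating recovery instances takes minutes.
rto = recovery_time(disaster, disaster + timedelta(minutes=8))
print(rpo.total_seconds(), rto.total_seconds() / 60)  # 5.0 8.0
```

The asymmetry is the point of the service: replication runs continuously so the data-loss window stays small, while the recovery environment is launched only when needed.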


Users can enable DRS using the AWS Elastic Disaster Recovery Console. In an AWS news blog, Steve Roberts, a developer advocate at AWS, explains that once the service is enabled, it continuously replicates block storage volumes from physical, virtual, or cloud-based servers, allowing it to support business RPOs measured in seconds. Recovery on AWS covers applications running on physical infrastructure, VMware vSphere, Microsoft Hyper-V, and cloud infrastructure, including critical databases such as Oracle, MySQL, and SQL Server, and enterprise applications such as SAP. DRS orchestrates the recovery process for the servers on AWS to enable an RTO measured in minutes, allowing users to recover all their apps and databases that run on supported Windows and Linux operating systems.

The GA of AWS DRS follows the earlier release of AWS Application Migration Service (AWS MGN) as a successor to CloudEndure and the primary migration service for lift-and-shift migrations to AWS. In a Reddit thread, a respondent commented:

So, this is to CloudEndure DR what Application Migration Service is to CloudEndure Migration.

And finally, note that AWS's biggest competitor in the cloud, Microsoft, also offers disaster recovery services with Azure Backup, Site Recovery, and Archive Storage.

AWS DRS is currently available in the US East (N. Virginia), US East (Ohio), US West (Oregon), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), and Europe (London) regions. Furthermore, additional details and guidance are available through the documentation page. Customers using the service are billed at an hourly rate per replicating source server, and more information on pricing details is on the pricing page.

Go here to read the rest:
AWS Announced General Availability of Elastic Disaster Recovery -

The Benefits of Using a Share File Server in Education – eLearningInside News

61% of businesses migrated their workloads to the cloud in 2020, and this trend is likely to continue rising in the next decade. One of the main areas where the cloud is universally leveraged is education, with schools and colleges using different resources to share and distribute educational material.

Some exclusively use educational cloud resources, like Microsoft Live for Education, while others are looking at more indie sources, like Centre Stack, for their file-sharing servers. If you're looking to add a share file server to your academic institution, consider the following.

Most students work from their laptops, which creates the assumption that sharing files within school grounds would be easy. However, problems start to appear any time a teacher or student has to share documents that may include private information, like an SSN.

Here are just a few problems that come with sharing files on an unsecured server:

To solve these problems, schools can adopt a shared file server.

A shared file server makes it easy to share folders and files internally with students, educators, and staff members. You can share files by uploading them to an online cloud storage server before sending them out, or you can share them directly over the Internet.

The most efficient share file servers use a hybrid solution, meaning they rely on both a cloud and an on-premise server to function. With an integrated system, no one has to relearn the program, which is great for busy students and educators who want to start sharing right away.

Educators who use shared file servers receive a whole host of benefits, including the following:

Keep in mind that most of these benefits come from hybrid shared file servers. The public cloud can be restrictive for most educational institutions, so always opt for a self-hosted infrastructure.

Educational institutions need to track each students performance and overall progress, meaning they must implement crucial metrics within the system.

For example, educators could assess student preparedness for exams based on average pages read, the number of research documents downloaded, and the types of books shared amongst peers. Share file servers have the flexibility to accommodate these metrics.

Southern Connecticut State University is using data collected from shared file servers to conduct longitudinal studies that span from a student's freshman orientation to graduation. With this information, they may be able to create a tailored learning experience for individual students.

The adoption of e-learning tools and apps has made it possible for students to connect with each other, share ideas, and study for tests. However, data breaches are common.

The server and network you're using can determine whether your private information is hacked and shared or kept secret. Schools can leverage 24/7 surveillance, IT professionals, and redundancies for power outages to keep their servers secure.

Featured Image: Pixabay, Pexels.

See the rest here:
The Benefits of Using a Share File Server in Education - eLearningInside News

5 questions for Mark Mills on the cloud revolution – Washington Examiner

After the 1918 Spanish Flu came the bustling era of the radio and the automobile known as the Roaring Twenties. Hoping history will repeat this post-pandemic boom, some have predicted a "New Roaring Twenties." But what coming innovation or general-purpose technology could spark such a period of progress and prosperity? My guest today thinks that technology has already been invented, and you probably use it every day. On a recent episode of Political Economy, Mark Mills discussed the cloud revolution and its economic implications.

Mark is a senior fellow at the Manhattan Institute and a faculty fellow at Northwestern University's McCormick School of Engineering and Applied Science. His latest book is The Cloud Revolution: How the Convergence of New Technologies Will Unleash the Next Economic Boom and A Roaring 2020s.

Below is an abbreviated transcript of our conversation. You can read our full discussion here. You can also subscribe to my podcast on Apple Podcasts or Stitcher, or download the podcast on Ricochet.

Pethokoukis: What is the cloud? Is it just like a big computer or a big warehouse full of so many computers that it, itself, is a mega computer?

Mills: Well, that's a part of the cloud. I would say 90 percent of people in America today use the cloud close to daily. If you do finance on your phone or a computer, if you order with DoorDash or Airbnb, or you use Google Maps, all these functions connect through wired and wireless networks to massive, warehouse-scale computers, data centers, to do processing.

They don't just store information and calculate. They do inference, not just calculations. So all those features together, all that physical infrastructure combined, is the cloud. But at the middle of the cloud (sort of its beating heart, if you'd like) are these extraordinarily big warehouse-scale, Walmart-sized buildings that are jam-packed full of computers and servers and storage.

You describe these mega computers as physical infrastructure. How do they compare to more traditional categories of infrastructure in the scale of our investment?

The biggest new infrastructure humanity has ever built is being built today, which is, loosely, the cloud. It is not an ethereal thing; it's a physical infrastructure. By all measures (dollars, physical equipment, square feet of buildings) it's the biggest infrastructure humanity has ever built. It's really quite remarkable. It's not just data centers. We're spending more on that now than we are spending on oil and gas infrastructure hardware. We're spending far more building out clouds that consume electricity than all the world's utilities are spending annually to produce and distribute electricity, so the cloud has already become a big consumer of capital.

One square meter of a typical cloud data center has about a thousand times more compute horsepower than the whole world had in the early '80s, and we're building out data centers at the moment at a rate of about 10 million square feet a year. And, interestingly, data centers cost about the same to build as a skyscraper like the Empire State Building or the World Trade Center. Dollar per square foot, it's about the same.
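Mills's build-out figures can be sanity-checked with rough arithmetic. The per-square-foot construction cost below is an assumed illustrative number for skyscraper-class buildings, not one he quotes:

```python
# Back-of-the-envelope check on the data-center build-out claims above.
build_rate_sqft = 10_000_000     # "about 10 million square feet a year"
assumed_cost_per_sqft = 1_000    # hypothetical skyscraper-class $/sq ft

implied_annual_spend = build_rate_sqft * assumed_cost_per_sqft
print(f"Implied construction spend: ${implied_annual_spend / 1e9:.0f}B per year")
```

Even under this rough assumption, the construction bill alone lands in the tens of billions of dollars per year, before any servers or networking gear are installed.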

Is the cloud revolution another example of innovation in the world of bits that won't translate into real productivity?

So the things that the cloud has made better in our lives are in the worlds of information, bits worlds, which are much easier to automate. Does that move the meter on the productivity of the economy? At the high level of economics, no, because the bits parts of our economy (news, entertainment and finance) are important, but collectively they are roughly 20 percent of GDP. So we have improved, with information, the information-centric parts of our economy.

If one looks across the physical domains of making stuff, what you see is that in every one of them, engineers are building pre-commercial or just-commercial capabilities that are as consequential to productivity as Airbnb and Uber are to those information-mediated domains. They are slower to be implemented because the regulations are, properly, more challenging. We can, ironically, accelerate that regulatory process soon, because computers are on the cusp of being good enough to screen out dangerous things more easily in silico.

Is the cloud a general-purpose technology?

The electric motor is a general-purpose technology. An internal combustion engine is a general-purpose technology. Microprocessors are general-purpose technologies in the same way. They're the building blocks of the cloud, your smartphone and end-use devices, so that's the general-purpose technology.

The cloud is more like a utility infrastructure, the way electricity is, the way water is, with a distinction that's not trivial: it enables other general-purpose capabilities, quite unlike electrification, because it invades all the others. Electricity has invaded a lot and we keep electrifying things, but it's not that easy to electrify everything. It just isn't. Some things are better done mechanically or with combustion. There's almost nothing we do for which more information is not useful. In fact, in everything we do, everything in life, more information and knowledge has utility. So when I take a general-purpose engine like a microprocessor and distribute it to the scale we distribute it at, we really change the world in a way that no other general-purpose technology has.

So if we look back a decade from now, is your best guess that we will have indisputably seen a productivity acceleration where a key technology has been the cloud?

Yes. I think we'll look back and say the same kind of thing people were starting to say when the internet took off early on. You and I both remember that a lot of people were skeptical. And most people did not anticipate the scale of the expansion of the internet. In hindsight: "Wow, it's a big deal." It generated whole new industries, trillion-dollar industries that didn't exist before. We're going to see that times 10 in the next decade.

I'll end with a line from Peter Drucker, who was one of the great analysts of business and of human nature. He said that in forecasting he only predicted what's already happened, and by that, he meant he looked at patterns of things that were already underway, that were firm and had high inertia, and said, "That will continue." And I think the pattern for a boom is already happening.

James Pethokoukis is the Dewitt Wallace Fellow at the American Enterprise Institute, where he writes and edits the AEIdeas blog and hosts a weekly podcast, Political Economy with James Pethokoukis. Mark Mills is a senior fellow at the Manhattan Institute and the author of The Cloud Revolution: How the Convergence of New Technologies Will Unleash the Next Economic Boom and A Roaring 2020s.

See the rest here:
5 questions for Mark Mills on the cloud revolution - Washington Examiner

Inspur Information Impresses in AI Performance with 7 Titles in MLPerf Training v1.1 – Business Wire

SAN JOSE, Calif.--(BUSINESS WIRE)--The open engineering consortium MLCommons released its latest MLPerf Training v1.1 results. Inspur Information submitted NF5488A5 and NF5688M6 server results for all 8 single-node closed division tasks, winning 7 of them.

MLPerf, established by MLCommons, is an AI performance benchmark that has become an important reference for customers purchasing AI solutions. For Training v1.1, 14 organizations participated. Results for 180 closed division and 6 open division tasks were submitted.

The closed division is very competitive because it requires the use of reference models to allow for an objective, apples-to-apples comparison of competing submissions. The benchmarks covered eight representative machine learning tasks, including Image classification (ResNet50), Medical Image Segmentation (U-Net 3D), and Natural Language Processing (BERT).

Inspur's NF5688M6 was the top performer in 4 tasks: Natural Language Processing (BERT), Object Detection Heavy-Weight (Mask R-CNN), Recommendation (DLRM), and Medical Image Segmentation (U-Net 3D). The NF5488A5 was the top performer in 3 tasks: Image Classification (ResNet50), Object Detection Light-Weight (SSD), and Speech Recognition (RNNT).

Inspur single-node title list, MLPerf v1.1 AI Training results:

Natural Language Processing (BERT) – NF5688M6
Object Detection Heavy-Weight (Mask R-CNN) – NF5688M6
Recommendation (DLRM) – NF5688M6
Medical Image Segmentation (U-Net 3D) – NF5688M6
Image Classification (ResNet50) – NF5488A5
Object Detection Light-Weight (SSD) – NF5488A5
Speech Recognition (RNNT) – NF5488A5

A full stack AI solution leads to AI training speed breakthroughs

Inspur AI servers led single-node performance in MLPerf v1.1 due to their unique software and hardware optimizations. Compared to Training v1.0, Inspur's AI training speed in Medical Image Segmentation, Speech Recognition, Recommendation, and Natural Language Processing increased by 18%, 14%, 11% and 8%, respectively.

This enhanced speed allows Inspur AI servers to process, per second, 12,600 images in SSD, 8,000 pieces of speech in RNNT, or 27,400 images in ResNet50 tasks.

In ResNet50, Inspur optimized the pre-processing of images, used the DALI framework, and ran decoding on the GPU to prevent CPU bottlenecks. These continuing optimizations have allowed Inspur to top the ResNet50 rankings for the last three benchmark rounds.

Inspur's leading performance in the MLPerf benchmarks is due to its superior system design and full-stack optimizations in AI. At the hardware level, the Inspur PCIe Retimer-Free design allows for high-speed interconnection between the CPU and GPU, enabling bottleneck-free IO transmission in AI training for improved efficiency. For high-load multi-GPU collaborative task scheduling, data transmission between NUMA nodes and GPUs is optimized and calibrated to ensure that data IO in training tasks is not blocked. Inspur uses A100-SXM-80GB (500W) GPUs, the highest-powered GPUs in the industry. This led to the development of an advanced cold-plate liquid cooling system to ensure that these high-power GPUs can work stably at full power, guaranteeing full system performance.

MLPerf 2021 officially concludes with Inspur Information winning 44 titles

MLPerf Training v1.1 Results mark the conclusion of MLPerf 2021. As the winner of 44 titles in 2021, Inspur showcased its leading AI training and inference performance.

The 2021 Inspur MLPerf performance title list (number of titles):

Data Center: 17 (out of 32)
Edge: 16 (out of 31)
Single node: 11 (out of 16)



Inspur AI servers NF5488A5, NF5688M6 and edge server NE5260M5 won 18, 15 and 11 titles respectively.

NF5488A5 is one of the first A100 servers to launch. It supports 8 A100 GPUs with 3rd-Gen NVLink and 2 AMD Milan CPUs in a 4U space, and supports both liquid and air cooling.

NF5688M6 is an AI server optimized for large-scale data centers with extreme scalability. It supports eight A100 GPUs, two Intel Ice Lake CPUs, and up to 13 PCIe Gen4 IO expansion cards.

NE5260M5 can be customized with various high-performance CPU and AI acceleration cards. The chassis depth of 430mm is half that of a normal server. Combined with vibration and noise reduction optimizations and rigorous reliability testing, this makes it ideal for edge computing.

About Inspur Information

Inspur Information is a leading provider of data center infrastructure, cloud computing, and AI solutions, and is a top two server manufacturer worldwide. Through engineering and innovation, Inspur Information delivers cutting-edge computing hardware design and extensive product offerings to address important technology segments such as open computing, cloud data centers, AI, and deep learning. Performance-optimized and purpose-built, our world-class solutions empower customers to tackle real-world challenges and custom workloads. To learn more, please go to

Continue reading here:
Inspur Information Impresses in AI Performance with 7 Titles in MLPerf Training v1.1 - Business Wire

How Secure Is iMessage? | Leaked FBI Document Reveals the Truth – iDrop News

Despite Apple's strong stance on privacy, we've always known the company has to walk a pretty fine line when it comes to locking things down too tightly, especially where dealing with law enforcement agencies is concerned.

Apple has always made a point that everything stored on your iPhone is quite securely encrypted, to the point that it's basically impossible to get at it without knowing your passcode.

Except for targeted malware attacks, the only way into an iPhone is to brute-force the passcode, which basically means hooking it up to a specialized device that tries every possible combination of numbers and possibly letters until it finds the correct one. However, using a longer alphanumeric password makes this effectively impossible within somebody's normal lifespan.
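The scale argument can be illustrated with quick search-space arithmetic. The attempt rate below is a hypothetical figure for an offline cracking rig, not a measured one, and it ignores the iPhone's own attempt throttling, which makes things even harder in practice:

```python
# Search-space sizes behind the brute-force claim above.
attempts_per_second = 1_000_000      # hypothetical specialized rig

six_digit_passcodes = 10 ** 6        # numeric passcode: digits 0-9
ten_char_alphanumeric = 62 ** 10     # upper + lower case letters + digits

seconds_per_year = 365 * 24 * 3600
pin_seconds = six_digit_passcodes / attempts_per_second
alnum_years = ten_char_alphanumeric / attempts_per_second / seconds_per_year

print(pin_seconds)   # seconds to exhaust all 6-digit passcodes
print(alnum_years)   # years to exhaust a 10-char alphanumeric space
```

At this assumed rate, every six-digit passcode can be tried in about a second, while exhausting a ten-character alphanumeric space takes tens of thousands of years, which is the gap the article is pointing at.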

In fact, the iPhone is so secure that it's put Apple at the centre of a pretty big controversy over whether big tech companies should be required to create a back door for government and law enforcement agencies.

U.S. lawmakers, in particular, have been taking steps to make end-to-end encryption illegal, raising the spectre of child exploitation as a bogeyman to justify their position that Apple should provide a master key for law enforcement to bypass encryption and easily perform a warranted search of any iPhone that comes under investigation. It's no wonder Apple has been trying to get ahead of the curve and appease lawmakers by finding a middle ground.

After all, if Apple doesn't tread carefully, it risks having all the privacy and security protections it has carefully built into iOS legislated out of existence by lawmakers under the guise of protecting kids.

This is undoubtedly also the thinking behind Apple's new Communication Safety feature that's coming in iOS 15.2, particularly since the entire iMessage platform is already tightly end-to-end encrypted, not just on each user's iPhone, but also as it travels through Apple's cloud servers.

In other words, barring any industrial-strength spyware on your device, when you send an iMessage to somebody, there's no way for anybody to intercept or read that message apart from the intended recipient(s).

Unfortunately, as great as that sounds, there are a few other weak links in how the Messages app stores its data that could result in others getting access to your messages, and this is especially true for law enforcement agencies.

Apple has never made any secret of the fact that it will comply with any valid law enforcement request by providing whatever data it can, which generally includes everything in your iCloud Backup.

In fact, during a Senate hearing two years ago, Apple's head of user privacy, Erik Neuenschwander, shared that the company received 127,000 requests from law enforcement from 2012 to 2019, and in most cases it responded to these within 20 minutes, usually by handing over all the pertinent data that's stored on its servers.

To be clear, Apple still can't open an iPhone. When senators accused Apple of blatantly refusing court orders to open an iPhone, Neuenschwander pointed out that no matter how much it may want to, Apple can't do what is essentially impossible, which includes breaking the strong encryption it has created for the iPhone.

Many lawmakers and politicians refuse to buy into this particular point, however, maintaining that Apple should be required to re-engineer its devices so that this becomes possible.

Fortunately for user privacy, those wishes have yet to become enshrined in law, so for now, agencies such as the FBI will need to be content with whatever Apple can provide.

An internal FBI document recently obtained and shared by Property of the People (via AppleInsider) outlines how iMessage stacks up against other secure messaging systems from the perspective of the FBI's ability to legally access content and metadata. The document is unclassified but labeled For Official Use Only (FOUO) and Law Enforcement Sensitive (LES).

While the document spells out what we already know, it's an interesting inside look at where iMessage fits in alongside others such as Signal, Telegram, and WhatsApp.

In the case of iMessage, the key vulnerability is one you should already be aware of, and it ultimately comes down to any data you've stored in your iCloud Backups.

Specifically, the document notes that the FBI can obtain "Limited" message content from iMessage. A subpoena can render basic subscriber information and 25 days of iMessage lookups to and from a target number, although a footnote explains that Apple includes a disclaimer that a log entry between parties does not indicate a conversation took place, and that these query logs have also contained errors.

On the other hand, a search warrant can render backups of a target device, and "if target uses iCloud backup, the encryption keys should also be provided with content return" (that is, as part of the backup), along with iMessages "if target has enabled Messages in iCloud."

In layman's terms, this means that if you're using iCloud Backups, any Messages data from your iPhone is vulnerable to a search warrant or to any hacker who gets access to your iCloud account. This can occur in two different ways:

In other words, if you're not using Messages in iCloud, then your messages are stored in your iCloud Backup in readable form.

If you are using Messages in iCloud, the key to decrypt your messages is stored in your iCloud Backup.

Either way, if you're using iCloud Backups, your iMessage history is vulnerable.
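The outcomes above can be summarized in a short illustrative sketch. This is a plain decision function restating the article's logic, not any Apple or FBI tooling:

```python
def imessage_content_recoverable(icloud_backup_enabled: bool,
                                 messages_in_icloud: bool) -> bool:
    """Restates the article's decision tree: can a warrant against iCloud
    yield readable iMessage content? Illustrative sketch only."""
    if not icloud_backup_enabled:
        # Messages live only on-device and in local computer backups.
        return False
    # With iCloud Backup on, either the messages themselves sit in the
    # backup in readable form, or (with Messages in iCloud enabled) the
    # decryption key is included in the backup alongside them.
    return True

print(imessage_content_recoverable(False, True))   # False
print(imessage_content_recoverable(True, False))   # True
print(imessage_content_recoverable(True, True))    # True
```

The Messages in iCloud setting changes how the content is exposed, but not whether it is exposed; only disabling iCloud Backup itself closes the loophole.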

Fortunately, you can disable iCloud Backups and back up your iPhone or iPad directly to your computer instead. In this case, your Messages data is safe, since even if you're using Messages in iCloud, that data will be stored using end-to-end encryption, with the key nowhere to be found on Apple's servers.

Of course, if you're not using Messages in iCloud, your messaging history won't be on Apple's servers at all; it will only be stored locally on your device and in your computer backups.

Note that even in this case, your actual iMessage conversations travel through Apple's servers, and SMS conversations travel through your carrier's network. While Apple can't provide the content of your messages, it may still be able to provide a log of who you've been communicating with.

Note that SMS text messages aren't even that secure, and there's a good chance your carrier can intercept everything going through those channels.

Just keep in mind that all bets are off if you're using a company-provided iPhone, as there are numerous management tools a corporate IT department can install to monitor your activity. In many jurisdictions, however, all communications that occur on company-owned hardware belong to the company, so you shouldn't have an expectation of privacy in those cases anyway.

The FBI document also provided details on what can be obtained from several other popular messaging systems, and many of these came out ahead of Apple's iMessage.

For example, Signal, Telegram, Threema, Viber, WeChat, and Wickr were all listed as providing "No Message Content." Line and WhatsApp provided "Limited" content, but only in specific cases.

WhatsApp users are vulnerable to the same loophole as iMessage users, with the FBI noting that "If target is using an iPhone and iCloud Backups enabled, iCloud returns may contain WhatsApp data, to include message content."

Line, on the other hand, can provide seven days' worth of a specified user's text chats in the face of an effective warrant, but only when the user has not enabled end-to-end encryption.

Among the listed messaging apps, Signal was unsurprisingly the most private of the bunch, with the ability to provide only the date and time that a user registered for the service, and the last time they connected to it.

Telegram came in a close second, with a note that it may disclose IP addresses and phone numbers to relevant authorities for confirmed terrorist investigations, but it does so solely at its own discretion.

Lastly, WeChat may be a special case. While the FBI notes that it can't get any message content out of the China-based chat service, that's probably not the case for Chinese authorities. In fact, the FBI notes that WeChat cannot provide records for accounts created in China, but will provide basic information such as name, phone number, email, and IP address for non-China accounts.

The same could be said for other messaging platforms owned by foreign companies, which might not be compelled to respond to U.S. law enforcement agencies, but could be required to comply with court orders from their own governments.

In most cases, these other messaging platforms maintain their security by avoiding iCloud Backups entirely. Developers can choose what data is stored in an iCloud Backup, and apps like Signal deliberately refuse to store anything at all, which is why you basically have to set it up from scratch when switching to a new iPhone.

After all, the best way to keep your data from falling into the wrong hands is to avoid keeping it in the first place.

Read this article:
How Secure Is iMessage? | Leaked FBI Document Reveals the Truth - iDrop News

Netweb Technologies Bags Award from MeitY in Contribution to the Manufacturing Sector – News Nation

Award Ceremony (Photo Credit: Netweb Technologies press release)

New Delhi:


New Delhi/NCR-based server, workstation and storage manufacturer Netweb Technologies, a pioneer in technology solutions, has been honored for its significant contribution to the manufacturing of servers in India. The award was presented by Ashwini Vaishnaw, Union Minister for Electronics and Information Technology, at the Ministry of Electronics and Information Technology's Azadi Ka Amrit Mahotsav event.

The event also featured Minister of State for Electronics & Information Technology and Skill Development and Entrepreneurship Rajeev Chandrashekhar, along with senior officials from the central ministries and departments, states and UTs, and stalwarts from industry. As part of the Azadi Ka Digital Mahotsav week celebrations from Nov 29 to Dec 5, the third day recognised and awarded leading electronics companies contributing to Make in India.

With deep expertise in high-performance computing, storage, cloud, artificial intelligence, and data centres, Netweb Technologies, the only Indian server manufacturer so honored, received this award for developing these technologies locally, which is helping the technical enhancement and development of the entire ecosystem around them. Expressing his gratitude on receiving the award, Sanjay Lodha, founder of Netweb, said: We are delighted to receive this award from MeitY for deploying servers all over India.

This recognition and subsequent felicitation of our enterprise is going to fuel us to work even harder, and we feel proud to be committed to the Atmanirbhar Bharat, or self-reliant India, momentum with our Make in India products. We have always sought to lead the industry by focusing on operational excellence and overall client satisfaction.


Winning this award has been the result of an incredible effort put in by the entire Netweb team. Our company has continually distinguished itself by the quality of the services we provide, and this award is a reflection of our commitment to revolutionize storage and computing in the country in an unprecedented manner.

With the COVID-induced need to move to cloud services, our aim is to help businesses by equipping them with the latest technology so that they can achieve their business goals and ROI, he added. Netweb sells its products and solutions in over 12 countries, has successfully deployed more than 500 HPC systems and over 7,000 GPU installations, and has empowered the cloud market in 8 countries.

First Published : 03 Dec 2021, 05:58:34 PM

Read more here:
Netweb Technologies Bags Award from MeitY in Contribution to the Manufacturing Sector - News Nation

FTC is Suing NVIDIA to Stop Its $40B Acquisition of Arm Amidst Concerns of Potential Reliance from Rival Firms Should Deal Push Through – Tech Times

Urian B., Tech Times 02 December 2021, 05:12 pm

(Photo: Image from Nana Dua on Unsplash)

The Federal Trade Commission (FTC) is suing NVIDIA in hopes of blocking its $40 billion acquisition of Arm from SoftBank over antitrust concerns. The deal has faced considerable scrutiny from regulators ever since its announcement in 2020.

The FTC action is the biggest hurdle NVIDIA's deal has faced so far, and it threatens whether the deal will go through at all. As per the FTC announcement, the proposed vertical deal gives "(NVIDIA) one of the largest chip companies control over the computing technology and designs that rival firms heavily rely on in order to complete their own chips."

According to the story by CNBC, NVIDIA stock was up about 2% before the announcement and didn't move significantly on the news. The FTC complaint itself has not yet been made public.

Arm is a core supplier of architecture technology for many semiconductor companies. Its Arm instruction set sits at the core of almost all mobile processors powering smartphones, including those made by Apple and Android devices using Qualcomm chips.

Arm's historical role in the chip industry has been that of a neutral supplier. The NVIDIA acquisition raises concerns that the company could decide to cut competitors off from purchasing essential Arm technology.

A number of NVIDIA's own processors also use Arm-designed cores and the Arm architecture. The company, however, is still best known for its GPUs, which use a different type of architecture.

In a statement, the FTC said the complaint alleges that the merger would give NVIDIA the ability and incentive to use its control of this much-needed technology to undermine its competitors. In addition, this could give NVIDIA the power to reduce competition, resulting in lower product quality, higher prices, reduced innovation, and fewer choices for consumers.


The FTC noted that Arm's licensees, which include competitors of NVIDIA, share competitively sensitive information with the firm. Accordingly, the FTC lawsuit focuses mainly on chips for driver assistance, networking products, and Arm-based chips used in cloud servers.

The FTC announced that it voted unanimously to issue the complaint against NVIDIA's $40 billion acquisition of Arm. The deal has also been receiving scrutiny overseas.

In fact, the FTC isn't the only party looking into the acquisition: in October, the European Commission announced that it would launch an in-depth investigation into the deal. News of the FTC action came shortly after an article by TechTimes revealed that NVIDIA plans to invest $6.9 billion through 2023 to help fight the global chip shortage.



See more here:
FTC is Suing NVIDIA to Stop Its $40B Acquisition of Arm Amidst Concerns of Potential Reliance from Rival Firms Should Deal Push Through - Tech Times

Securing the edge server infrastructure from the ground up – The Register

Paid Feature Edge computing has seen enterprise IT infrastructure escape the confines of the traditional data center, putting processing power closer to where the action is, or at least to where the data is generated. Among the reasons for such edge initiatives is enabling organizations to gain real-time, actionable insights from their data.

But building out IT infrastructure at the network edge comes with its challenges. For instance, deploying systems outside the protective walls of a centralized data center can leave them exposed to theft and vandalism, not to mention tampering that could lead to the loss of sensitive data or the compromise of the entire corporate network.

The upshot is that systems for edge deployment must be first-class citizens when it comes to security, and should have the same level of security features as you would find in infrastructure inside a traditional data center.

Edge systems also need security built in from the ground up; in other words, not added as an afterthought. Organizations also need adaptive, flexible compute infrastructure to handle a diverse range of workloads, with enterprise edge use cases spanning environments such as remote office/branch office, hospitality, logistics operations, and retail outlets.

These are some of the considerations that Dell Technologies tackles with the latest additions to its Dell EMC PowerEdge server portfolio, designed for small and medium-sized businesses as well as enterprise customers.

These comprise four entry-level models, the PowerEdge R250 and R350 rack servers and the PowerEdge T150 and T350 tower servers, plus one mid-range to high-performance model, the PowerEdge T550 tower server.

The systems are designed as flexible and reliable building blocks for business-critical workloads, cloud infrastructure, and point-of-sale transactions. According to the firm, the new models incorporate a cyber-resilient architecture, starting at the hardware level with the silicon design and permeating the system's entire lifecycle, from manufacturing through the supply chain, right through to retirement of the hardware.

Perhaps the most notable new model is the PowerEdge T550, a flexible two-socket tower chassis server that, Dell Technologies says, balances expandability and performance. This system is based on the latest 3rd Gen Intel Xeon scalable processors, enabling it to run complex workloads using highly scalable memory, I/O and network options.

With support for up to 16 DDR4 DIMMs and up to 24 drives, the PowerEdge T550 is a substantial general-purpose platform capable of handling demanding workloads and applications, such as data warehousing, ecommerce, databases, and high-performance computing (HPC).

According to Dell Technologies, the PowerEdge T550 supports advanced technologies for enterprise-class workloads such as virtualization, medical imaging, data analytics, and software-defined storage. With 3rd Gen Intel Xeon scalable processors, the PowerEdge T550 can also be used for applications requiring AI acceleration thanks to Intel's Deep Learning Boost technology.

To ensure the security of edge deployments, Dell Technologies employs a multi-layered approach which starts at the hardware with an immutable Root of Trust. In PowerEdge servers, the Root of Trust is based on read-only public keys that at startup attest to the integrity of the system BIOS and the firmware for the Integrated Dell Remote Access Controller (iDRAC).

This enables an end-to-end verified boot, which means that at each stage of the boot cycle, each piece of code is verified by cryptographic signature. If some code fails the verification process, Dell provides the ability to revert to a known good image.
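The chain can be illustrated with a short sketch. This is a toy model, not Dell's actual implementation: real PowerEdge systems verify cryptographic signatures anchored in silicon, whereas here each stage is simply checked against a SHA-256 digest held by a simulated Root of Trust, and the stage names are invented.

```python
import hashlib

def digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

# Hypothetical boot stages; on a real server these would be the iDRAC
# firmware, the system BIOS, the bootloader, and so on.
stages = [b"idrac-firmware-v7", b"bios-v2.3", b"os-bootloader-v1"]

# The immutable Root of Trust anchors the expected digest of each stage.
expected = [digest(s) for s in stages]

def verified_boot(images, expected_digests):
    """Verify each stage before handing control to it; revert on mismatch."""
    for i, image in enumerate(images):
        if digest(image) != expected_digests[i]:
            # Verification failed: fall back to the known-good image.
            return f"stage {i} reverted to known-good image"
    return "boot verified"

# An untouched chain verifies; a tampered BIOS is caught at its stage.
tampered = [stages[0], b"bios-with-implant", stages[2]]
assert verified_boot(stages, expected) == "boot verified"
assert verified_boot(tampered, expected) == "stage 1 reverted to known-good image"
```

The essential property is that trust flows one way: each stage is measured before it runs, so a compromised later stage cannot retroactively hide from an earlier, already-verified one.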

Protecting data is vital for any enterprise, and this goes doubly so in a perhaps vulnerable edge deployment. For this reason, Dell Technologies supports self-encrypting drives (SEDs) in its new PowerEdge servers, with the keys for accessing the drives stored in the PowerEdge Raid Controller (PERC). If a drive is stolen, the data is inaccessible without the key stored in the PERC.

Dell Technologies also provides higher-level management of the keys needed to access the encrypted drives. Secure Enterprise Key Manager (SEKM) uses a key management server (KMS) to store keys centrally, distributing them to the PERC through the iDRAC in each server to unlock the server's storage devices at boot time. This arrangement ensures that even if an entire server is removed from an edge data center or enclosure, the data stored on it remains encrypted and inaccessible without the central KMS keys.
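A minimal sketch of this key-escrow flow, under loose assumptions: the server ID and KMS are hypothetical, and a toy hash-derived XOR stream cipher stands in for the drive's real AES hardware. It illustrates the flow only, and is not usable encryption.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (stand-in for the drive's AES hardware)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# Hypothetical central key management server: keys never rest on the host.
kms = {"edge-server-42": secrets.token_bytes(32)}

def unlock_drive(server_id: str, ciphertext: bytes) -> bytes:
    """At boot, the iDRAC fetches the key from the KMS for the PERC."""
    key = kms[server_id]  # raises KeyError if the KMS denies the request
    return keystream_xor(key, ciphertext)

secret_data = b"patient records"
encrypted = keystream_xor(kms["edge-server-42"], secret_data)
assert unlock_drive("edge-server-42", encrypted) == secret_data
# A stolen server that cannot reach the KMS cannot decrypt its own drives.
```

The design choice being illustrated: because the unlock key lives only in the central KMS, physically removing a server from the edge site removes its access to its own data.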

The latest PowerEdge systems protect against malicious code that attempts to target the memory space of running applications, courtesy of the Software Guard Extensions (SGX) found in newer Intel Xeon processors. This capability enables secure enclaves to be created in memory for sensitive processes, accessible only by the owning process. The 3rd Gen Intel Xeon scalable processors in the PowerEdge T550 are Intel's first mainstream two-socket processors to feature SGX across all SKUs.

As recent supply chain attacks have shown, it is possible to compromise a system at any point in the chain. For example, a server could be infected with malware for later exploitation before it even reaches the customer. To tackle this issue, Dell has introduced Secured Component Verification (SCV), a supply chain assurance scheme to verify that the system that arrives at the customer site is the same as was built in the factory.

This is achieved by generating a certificate from the unique component IDs during the factory assembly process, which is signed in the Dell factory and stored in the server's iDRAC. The customer can use SCV to validate the system inventory against the SCV certificate; any swapping or removal of the components from which the certificate was generated at the factory will show up as a mismatch.
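The verify-at-the-customer flow can be sketched as follows. The component IDs are invented, and an HMAC under a shared key stands in for the certificate that Dell signs with its factory private key; the point is the mismatch-on-swap behavior, not the exact certificate format.

```python
import hashlib
import hmac

FACTORY_KEY = b"dell-factory-signing-key"  # stand-in for the factory's private key

def sign_inventory(component_ids) -> str:
    """Factory side: sign a digest over the sorted component IDs."""
    payload = "|".join(sorted(component_ids)).encode()
    return hmac.new(FACTORY_KEY, payload, hashlib.sha256).hexdigest()

def validate_inventory(component_ids, certificate) -> bool:
    """Customer side: recompute and compare against the stored certificate."""
    return hmac.compare_digest(sign_inventory(component_ids), certificate)

# Hypothetical parts list recorded during factory assembly.
factory_parts = ["CPU-8352Y", "DIMM-A1-16G", "NIC-X710", "SSD-0-960G"]
cert = sign_inventory(factory_parts)  # stored in the server's iDRAC

assert validate_inventory(factory_parts, cert)        # untouched: match
swapped = factory_parts[:-1] + ["SSD-0-IMPLANT"]
assert not validate_inventory(swapped, cert)          # swapped drive detected
```

Because the certificate binds the exact set of component IDs, a component substituted anywhere along the supply chain fails validation at the customer site.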

The cyber-resilient architecture of Dell EMC PowerEdge systems supports a secure server lifecycle. This begins with secure provisioning and ensuring that any images loaded on to the server are secure, signed and verified.

In some PowerEdge models, Dell supports live scanning of the system BIOS, which makes it possible to verify the integrity and authenticity of the BIOS image in the primary ROM not just at boot up but also whilst the host is powered on and running. This scan is scheduled through the iDRAC.

The latest generation of PowerEdge servers can also securely control a server's configuration after it is provisioned. System Lockdown mode prevents users without system privileges from making changes to the configuration or firmware, thus protecting the system from unintentional or malicious changes.

Security is not something that should be tacked on to servers on an as-you-go basis

Dell Technologies has supported digital signatures on firmware updates for several generations of PowerEdge servers. This feature assures that only authentic firmware is running on the server platform. Dell digitally signs all firmware packages, and the iDRAC scans and compares their signatures with what is expected using the silicon-based Root of Trust. Any firmware package that fails validation is aborted and an error message is logged.

At the end of the system lifecycle, Dell's PowerEdge portfolio includes Secure Erase to remove sensitive data and settings. Customers can wipe storage devices and non-volatile stores such as caches and logs, so that no information is unintentionally exposed after disposal.

The ability to remotely manage systems without an engineer having to physically attend the site is a prerequisite for edge deployments. This is a core capability of the Dell EMC OpenManage Enterprise management platform, which allows IT staff to discover, deploy, update and monitor PowerEdge servers.

For example, OpenManage Enterprise working with iDRAC enables an organization to detect any drift from a user-defined configuration template, and fix the issue.
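Drift detection of this kind reduces to comparing a live configuration against the template. A minimal sketch, with hypothetical setting names rather than actual iDRAC attributes:

```python
# Hypothetical user-defined template vs. a server's live configuration.
template = {"boot_mode": "uefi", "lockdown": "enabled", "ntp": "10.0.0.1"}
live = {"boot_mode": "uefi", "lockdown": "disabled", "ntp": "10.0.0.1"}

def detect_drift(template: dict, live: dict) -> dict:
    """Return each setting whose live value differs from the template,
    mapped to a (expected, actual) pair. Missing settings appear as None."""
    return {key: (template[key], live.get(key))
            for key in template
            if live.get(key) != template[key]}

drift = detect_drift(template, live)
assert drift == {"lockdown": ("enabled", "disabled")}
# Remediation would push the template value back via the management API.
```

In practice the comparison runs on a schedule, and any non-empty drift report triggers an alert or an automatic reapplication of the template.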

To conclude, security is not something that should be tacked on to servers on an as-you-go basis. It must be built into server hardware from the outset. This is just as important for edge deployments as the data center. With a secure server infrastructure in place, IT teams can spend less time reacting to security issues, thereby improving their productivity.

Dell's latest PowerEdge server systems show the way, with security embedded in the hardware and a secure lifecycle extending from the factory right through to retirement of the hardware by the customer, ensuring that systems and the data they contain stay as secure as possible.

Sponsored by Dell.

Read more:
Securing the edge server infrastructure from the ground up - The Register