Dow Jones in the Spotlight, Bonds Stabilize, Tech Plays Catch-Up

KEY TAKEAWAYS

  • The Dow Jones Industrial Average closed at a new all-time high.
  • Stock market indexes still have bullish momentum in spite of up and down movement.
  • Bond prices could stabilize after digesting the interest rate cut.

The Federal Reserve’s interest rate cut decision on Wednesday was like receiving a gift from a wish list. When the rate cut was announced, the market initially rose, acting surprised by the decision. But the excitement fizzled out as the market closed lower on the day. The next day, buyers were back, but Friday’s action had more selling than buying. You have to cut it some slack, though, given it was triple witching Friday—the expiration of stock options, index options, and index futures. It’s not unusual to see elevated trading activity as traders work on unloading positions or rolling them out to a future date.

In spite of the stock market’s up and down movement, the broader market indexes didn’t take too much of a hit. The S&P 500 ($SPX) and Nasdaq Composite ($COMPQ) closed just a hair lower, while the Dow Jones Industrial Average ($INDU) closed slightly higher, notching an all-time record close.

Let’s unpack the charts of the broader indexes, starting with the S&P 500.

S&P 500 Breaks Above Resistance

The large-cap S&P 500 index broke above the resistance of its slightly downward-sloping trendline. The daily chart below shows that market breadth in equities is improving. Note that three market breadth indicators are displayed in the lower panels below the price chart.

CHART 1. DAILY CHART OF THE S&P 500. The large-cap index still has momentum with market breadth indicators confirming bullish strength. Chart source: StockCharts.com. For educational purposes.

The S&P 500 Bullish Percent Index ($BPSPX) is at 77, the NYSE Advance-Decline Line is rising, and the percent of S&P 500 stocks trading above their 200-day moving average is at 76.60. All three indicators confirm bullish momentum in the S&P 500.
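These breadth readings can be computed from raw price data. Below is a minimal sketch of two of the three indicators (the advance-decline line and the percent of stocks above their 200-day moving average), using made-up prices rather than actual S&P 500 data; the Bullish Percent Index requires point-and-figure buy signals and is omitted here:

```python
# Hedged sketch: two of the breadth measures mentioned above, computed
# from hypothetical daily closing prices (not actual S&P 500 data).

def advance_decline_line(daily_changes):
    """Cumulative (advancers - decliners) across days.
    daily_changes: one list per day of per-stock price changes."""
    line, cumulative = [], 0
    for day in daily_changes:
        advancers = sum(1 for change in day if change > 0)
        decliners = sum(1 for change in day if change < 0)
        cumulative += advancers - decliners
        line.append(cumulative)
    return line

def pct_above_ma(closes_by_stock, window=200):
    """Percent of stocks whose latest close sits above their
    `window`-day simple moving average."""
    above = 0
    for closes in closes_by_stock:
        ma = sum(closes[-window:]) / min(window, len(closes))
        if closes[-1] > ma:
            above += 1
    return 100.0 * above / len(closes_by_stock)

# Toy example: three days, four stocks
changes = [[0.5, -0.2, 1.1, 0.3],
           [-0.4, -0.1, 0.2, 0.6],
           [0.9, 0.7, -0.3, 0.1]]
print(advance_decline_line(changes))  # [2, 2, 4]
```

A rising advance-decline line together with a high percent-above-MA reading is exactly the combination the lower chart panels are showing.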

The Nasdaq Composite

The tech-heavy Nasdaq Composite ($COMPQ) has also broken above the resistance of its downtrend line but, unlike the S&P 500, it didn’t close at a new all-time high this week. Its market breadth isn’t as strong as that of the S&P 500, as is visible in the market breadth indicators in the lower panels.

CHART 2. DAILY CHART OF NASDAQ COMPOSITE. The Nasdaq is trading around its August high. If it breaks above that level and market breadth continues to expand, it would confirm a bullish move. Chart source: StockCharts.com. For educational purposes.

The BPI for the Nasdaq is at 54.85, which is slightly bullish. The percentage of Nasdaq stocks trading above their 200-day moving average is at 44.23, while the Nasdaq Advance-Decline Line is rising. So overall market breadth for the Nasdaq doesn’t confirm an uptrend as strongly as that of the S&P 500.

The Nasdaq Composite is trading close to its August high. A break above this would confirm a bullish move, so it’s worth adding this chart to your ChartLists.

The Dow Jones Industrial Average

The granddaddy of the indexes has been marching higher, closing at a new all-time high (see chart below). After pulling back in early September, the Dow has taken the lead.

CHART 3. DAILY CHART OF DOW JONES INDUSTRIAL AVERAGE. The index closed at a record high and market breadth indicators point to strong bullish pressure. Chart source: StockCharts.com. For educational purposes.

The DJIA BPI is above 80 and trending higher, the percentage of Dow stocks trading above their 200-day moving average is relatively flat, and the Dow Advance-Decline Line continues to rise. All three breadth indicators confirm the Dow is bullish.

The takeaway: The three broad indexes are up for the month. There’s a week and a day remaining this month. Will this September buck the seasonality trend?

Bonds, Gold, Oil

Bond prices have fallen since the Fed’s decision, possibly because the bond market is still coming to grips with the news. The chart of the iShares 20+ Year Treasury Bond ETF (TLT) below shows that TLT is close to a support level.

CHART 4. BOND PRICES FALL BUT COULD FIND STABILITY SOON. Watch bond prices at the nearest support level. Chart source: StockCharts.com. For educational purposes.

If it stabilizes at this level and turns higher, it could present an opportunity to allocate a portion of your portfolio to bonds.  

Meanwhile, commodities are showing upside price movement. Gold prices continue to rise, closing at an all-time high on Friday. Oil prices are off their lows, although they are still in a downtrend. The Energy Select Sector SPDR Fund (XLE) is at its 200-day moving average. Let’s see if it breaks above it next week. Energy was the leading sector for the week. And don’t ignore Utilities; the sector was the leading sector on Friday and could be poised for more upside movement.

In the Tech Front…

The week ended on interesting news. Talk of a Qualcomm (QCOM) takeover of Intel (INTC) surfaced on Friday. Shares of INTC traded higher on the news. This could impact chip stocks, which have had a rough ride of late. Another chip company in focus is Micron Technology (MU), which reports earnings next week. The rumor is that there may be some negative news. Micron has taken a beating since June, and technically, the chart looks ugly.

End of Week Wrap Up

  • S&P 500 closed up 1.36% for the week, at 5,702.55; Dow Jones Industrial Average closed up 1.62% for the week, at 42,063.36; Nasdaq Composite closed up 1.49% for the week, at 17,948.32
  • $VIX down 2.48% for the week closing at 16.15
  • Best performing sector for the week: Energy
  • Worst performing sector for the week: Real Estate
  • Top 5 Large Cap SCTR stocks: Insmed Inc. (INSM); Carvana (CVNA); Applovin Corp (APP); Cava Group (CAVA); FTAI Aviation Ltd. (FTAI)
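As a sanity check, a weekly percentage change together with Friday's close implies the prior Friday's closing level. A small sketch using the S&P 500 figures from the list above:

```python
def implied_prior_close(close, weekly_pct_change):
    """Back out last week's closing level from this week's close
    and the reported week-over-week percentage change."""
    return close / (1 + weekly_pct_change / 100)

# S&P 500 figures from the wrap-up above: up 1.36% to 5,702.55
print(round(implied_prior_close(5702.55, 1.36), 2))
```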

On the Radar Next Week

  • August New Home Sales
  • Q2 GDP Growth Rate
  • August Durable Goods Orders
  • Speeches from Chairman Powell and other Fed officials
  • August Personal Consumption Expenditure (PCE)
  • Micron (MU) Earnings

Disclaimer: This blog is for educational purposes only and should not be construed as financial advice. The ideas and strategies should never be used without first assessing your own personal and financial situation, or without consulting a financial professional.

Jayanthi Gopalakrishnan

About the author:
Jayanthi Gopalakrishnan is Director of Site Content at StockCharts.com. She spends her time coming up with content strategies, delivering content to educate traders and investors, and finding ways to make technical analysis fun. Jayanthi was Managing Editor at T3 Custom, a content marketing agency for financial brands. Prior to that, she was Managing Editor of Technical Analysis of Stocks & Commodities magazine for 15+ years.


With Snapdragon X Elite, Qualcomm Looks to Break Intel, AMD’s PC Hegemony

Microsoft introduced its new line of artificial intelligence (AI) PCs last month, announcing Copilot+ Windows laptops packing new silicon that can match, and even surpass, the performance of a MacBook running on Apple’s in-house M series chips. The new thin and light Copilot+ laptops, from both Microsoft and its OEM partners Acer, ASUS, Dell, HP, Lenovo, and Samsung, promise all-day battery life and AI features like Recall, Live Captions, Co-creator, and more.

A host of these Copilot+ laptops are powered by Qualcomm’s Arm-based Snapdragon X Elite platform, announced at the Snapdragon Summit in October 2023. The American chipmaker, a major player in the smartphone SoC market, now hopes to take on chip giants Intel and AMD, and Apple’s in-house silicon, to make its mark in the PC segment. Microsoft is reportedly confident that the Snapdragon X Elite-powered Windows laptop can finally beat Apple’s MacBook models. In fact, as per internal testing from Qualcomm, the X Elite is said to be 28 percent faster than Apple’s M3 chipset.

On the eve of Computex Taipei last week, Qualcomm president and CEO Cristiano Amon took the stage and highlighted the capabilities of the new line of Copilot+ PCs. “The PC is reborn,” Amon said in his keynote, calling X Elite-powered Copilot+ PCs “the fastest, most intelligent Windows PCs ever built.” At Computex, the new laptops running on Qualcomm silicon were on display, showcasing AI features like Live Captions and Recall. Gadgets 360 got a chance to interact with X Elite-powered Microsoft Surface laptops at a Qualcomm demo room and try out some of the features first-hand.

Microsoft Surface Copilot+ laptops at Qualcomm’s demo room at Computex Taipei

On the sidelines of the tech trade show in Taipei, we also caught up with Nitin Kumar, vice-president of product management for Snapdragon chipsets at Qualcomm. In our chat, we covered a lot of ground — from the company’s partnership with Microsoft, to new Copilot+ features on the X Elite chip. We also touched upon the expectations from the new platform in terms of gaming performance and the path ahead for AI-powered laptops. Here are slightly edited and condensed excerpts from our chat:

Gadgets 360: Can you talk about the journey that has led to the X Elite chip launch and Qualcomm coming back into the PC game to position itself against traditional heavyweights like Intel and AMD, making it more of a three-horse race?

Nitin Kumar: Absolutely. I’ll actually take a step back and walk you through how we have envisioned this, why we are in this business and what we are trying to achieve. Of course, Qualcomm has a long history and legacy in smartphones, and India is a very predominant market for us; Snapdragon is a very well-known brand name for India market for smartphones.

If you look at the last decade, Qualcomm has invested predominantly in smartphones in multiple technologies. Every year, with every generation, we have driven the technology forward, we’ve driven the experience forward. When you draw a parallel to that with the PC industry, to be very honest, it hasn’t seen any significant innovation. When you look at the same, let’s say, last ten years or beyond, be it in terms of improvement or user experience or feature set or even battery life for that matter, things haven’t improved significantly.

So, we have the technological know-how and we wanted to bring that level of innovation onto the PC industry to provide that disruption. So that when you innovate or when you do more tasks or whatever you want to do with your PC, you get that better capability. So, our journey has been from that vision, that we have our core strength in leading technology, absolutely leading battery life — that’s our DNA — and we want to leverage our core strength, our DNA onto the PC industry and along with partnering with Microsoft with full Windows experience natively available on Arm architecture-based Snapdragon processor — we’ve had two or three generations of product in the past — but this one is absolutely foundational, absolutely unique, absolutely disruptive with the level of technology that we have packed in the X series processors. We’re very proud of that. That’s what the journey has been and that’s our vision: to be disruptive in the marketplace where the PC industry hasn’t been.


Intel unveiled its Lunar Lake laptop chip at Computex Taipei

Gadgets 360: From the consumer perspective, it feels like, in the past year, the pace of innovation, especially on the chipset side, has been pretty quick. We’ve already seen multiple iterations of advanced AI-capable chips coming out. If a consumer wants a laptop with AI-ready features, they can look at options running on AMD chips, or Intel’s (upcoming) Lunar Lake chips, and now ones running on Snapdragon. How does the consumer decide what to get? PC consumers know that Intel and AMD have been in the PC market for long, and Snapdragon is now coming in (with X Elite). Reported benchmark numbers say that it’s as powerful or more powerful than the M3 chip from Apple. But how do you make the consumer make that switch in their heads?

Nitin Kumar: Great question. And we think about that a lot, and I completely understand the crowded aspect of the messaging and the crowded aspect of the marketplace. Let me first give you a few data points on how we are looking at it and what we are doing as well. You’re correct, our competition in the x86 world has had a significantly longer legacy in terms of PC space. So, we completely understand that, and they have a presence associated with that. But at the same time, if you look at what Snapdragon is known for, especially in the India market — for premium Android experience, you want that mobile device experience, you want the best experience on an Android phone that you can get, you got to go Snapdragon.

So, there is a brand equity that we have with Snapdragon. And we are bringing that same brand promise, that we have worked hard over generations to create with our mobile phone, onto a PC, as well. So, when as a consumer you have the absolute leading-edge Android smartphone powered by Snapdragon 800 series, 8 Gen 2, 8 Gen 3 platform, what you get is premium experience, premium battery life. So, for the consumer, it should be a seamless transition to look at. ‘OK, I go with Snapdragon X Elite, I’m bound to get the premium experience and premium battery life.’ That gives us a big, big edge over there.

And even with everybody talking about AI capability, I think there needs to be a second-level filter on what that really means, and how you decipher the two. Not all NPUs are created equal, if you will. What truly matters? Of course, one is just peak performance: okay, I have a 40 TOPS (trillions of operations per second) NPU or a 45 TOPS NPU. But what truly matters is to think of it from a metric which is NPU performance per power consumption, or NPU performance per watt: how much can I load up on my NPU and get that task done quickly while burning the least amount of power?

That’s where the efficiency, the mobility aspect will come in, and that’s where the Snapdragon promise, which comes from the mobile space, will tie into PC and help us differentiate in the market. We are very confident in our approach, in the promise that we are delivering, and we have made our comparisons. We’re very confident that these devices will be the absolute best Windows 11 Copilot+ devices in the marketplace. And with the strength of our partners, ecosystem and partnership with Microsoft, I think we will disrupt the market.
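Kumar's performance-per-watt framing reduces to a simple ratio. A minimal sketch, with illustrative numbers that are not vendor specifications:

```python
# Illustrative only: the TOPS and watt figures below are made up,
# not specifications of any actual chip.

def npu_efficiency(tops, watts):
    """NPU performance per watt: peak TOPS divided by power draw."""
    return tops / watts

# Two hypothetical chips with the same headline TOPS rating
chip_a = npu_efficiency(tops=45, watts=15)  # 3.0 TOPS/W
chip_b = npu_efficiency(tops=45, watts=30)  # 1.5 TOPS/W
# By this metric, chip_a completes the same NPU workload at half
# the energy cost despite an identical peak-performance number.
```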


The Snapdragon X Elite was announced at Snapdragon Summit in October last year

Gadgets 360: We’ve also heard from Qualcomm that 1,200 games have been tested on the X Elite chip, but performance numbers have not been made public yet. And I’m not going to get into that. But what can gamers expect from Qualcomm?

Nitin Kumar: Very good point. It’s definitely in our interest to set the right expectation in the marketplace on what these systems are capable of. In the initial phase, we have been focused predominantly on consumer and then commercial space. To be very honest, gaming as a category is not our initial focus and the initial set of devices (on Snapdragon silicon) are not going to compete with the gaming category of devices.

Having said that, there is a class of gamers that are casual gamers, not necessarily hard-core gamers. But when you look at casual gaming, the performance the system is driving, it’s actually meeting or beating any other integrated GPU class of product for a casual gaming experience that is available today. And more importantly, you are able to achieve that same level of graphics and gaming capability at half the power. So, think from a casual gamer perspective, this will drive a better experience, not from a performance perspective, but battery-life perspective, and overall, you get a better system.

Gadgets 360: We tried out some of the AI features on Snapdragon-powered Microsoft Surface Copilot+ laptops. Some of them weren’t as accurate when it came to localised prompts. But Live Captions worked well with Hindi. How do you plan to localise these AI features?

Nitin Kumar: Qualcomm has been integrating an AI engine on our mobile platforms for almost a decade now. So, the class of leading-edge processors that we’ve had for our smartphone lineup, they’ve all had an AI engine of some sorts, half a TOPS, one TOPS — doesn’t matter. But we have been innovating on AI for long. The PC industry has just started to talk about it. So, think of all the AI models, AI work, AI optimisation that we have worked on the mobile front, we’re bringing all that out onto our PC platform as well — all that has been optimised, localised, and regionalised for the different regions.

So live translation is a very popular use case on smartphones. All those AI use cases have been optimised on Snapdragon architecture. The fundamental architecture on smartphone and PC is exactly the same, so you can take that model and run it on a PC, and it will have a phenomenal result because it’s already so finely, optimally tuned for Snapdragon architecture.

Now the question is, what do you do from a Qualcomm perspective to enable the developers to optimise these new models for the PC industry? The PCs may require a different set of models, different AI use cases, and how do you enable that? For the last 10 years, we’ve been working on this AI journey on smartphone that we’re bringing to PC, so we have a whole set of AI tool sets, AI SDKs, AI stack models that we’ve made available to all the developers. So, we give them all the documentation and tool sets, and you as a developer can take that tool set, start writing your code. The tool set will give you all the changes required to make the model fully optimised on Snapdragon architecture. We have built that over a long time.

On top of that, at Microsoft Build, about two weeks back, we launched something called AI Hub for compute. It takes the ability of a developer to write their AI application on Snapdragon to the next level and makes it super convenient for them. You log in to AI Hub for compute, and it has about 70 or 100 models that are already optimised for Snapdragon architecture — Stable Diffusion is one model, Llama 2 is another, and Hugging Face models are already there.

What you can also do is, if let’s say you are a developer and you want to optimise for Snapdragon PC, you can create your model, upload it onto that portal, and it will run that model on a physical Snapdragon X Elite PC, and give you back the results to tell you, ‘hey, this is how your runtime was, this is where the optimisations can be, this is how you can debug it’ — all remotely. You don’t even need to have a Snapdragon X Elite as a developer device available with you. You can do it all remotely. You can build your own app with it. It will help you deploy it as well. All of it is super seamless and the traction on that has been tremendous since we launched it.

So, some of the models may not be fully optimised, as you said, they may not be regionalised. But we expect a lot of the local community, local developers — we have a partnership going on with local Indian developers — to partner and enable those AI experiences as locally as they want for their respective markets.


AI was the dominant theme at Computex Taipei this year

Gadgets 360: AI features are still in the stage of infancy. We’ve seen features that are focused more on helping creatives with their work, providing them new tools. But when do you see some of these features evolving and helping people and communities who are more in need, like aiding people with disabilities or people with less access?

Nitin Kumar: The variety of applications that we’re seeing, to be honest, is actually quite wide, ranging on a broad spectrum. There are creative applications like Co-creator, which help you create images with the help of prompts. Then there are Llama-type models, or large language models, which can write essays. So that’s on the productive side. I’ll give you a few more examples. The Djay Pro application can help DJs mix music. You play any song, and it can quickly take the vocals out, or just take the drums out, or take the bass out, and isolate sounds. That is an apt algorithm to run on the NPU in a very low-latency fashion, locally on the device.

Then think of it from a developer perspective: You’re a software developer, you can generate code on the fly locally. There are a variety of these applications. I truly believe this is the start of a new sort of revolution and a new class of applications, because the power of the platform is just immense.

Gadgets 360: We’ve seen how quick the pace of innovation for AI has been in the past one-and-a-half years. It feels like when OpenAI kicked open the door, everything else fell out, as well. Do you see this intense pace of innovation being sustained or will it slow down and reach a plateau soon? The PC market in the past decade, before AI, seemed to have flatlined.

Nitin Kumar: Because of the disruption that this technology brings, as with any new technology, I think the pace initially will be quite disruptive and at an inflection point. You can draw a parallel to when smartphones were just coming around, be it the Apple ecosystem or the Android ecosystem. And the App Store was coming out with just 10 apps, 1,000 apps, and then 10,000 apps and then millions. And then it just grew very rapidly from there. And a whole series of use cases emerged out of the device, that were just unimaginable. You can go back another decade and think of the Internet as that moment. It was very hard to say (at the time) how many websites will show up or what kind of use cases will show up, or how does this thing even evolve? It’s unimaginable at that point.

But you know that the essence of the technology has enough of a spark to drive that kind of a revolution. So, my personal belief is that AI is definitely there. And of course it’s very widespread — from a cloud AI perspective, server AI perspective; we’re focusing a lot more on the on-device perspective. When you look at it in totality, this will completely drive a new set of devices, experiences, innovation, apps. The usability aspect of it is just hard to imagine, but that spiral effect, at least for a few years — it might be longer, as well — will continue at a very, very rapid pace. And from a Snapdragon perspective, we’re very excited about where we’re headed.




Nvidia and AI changed landscape of the chip industry, as rivals play catch-up

This year’s artificial-intelligence boom turned the landscape of the semiconductor industry on its head, elevating Nvidia Corp. as the new king of U.S. chip companies — and putting more pressure on the newly crowned company for the year ahead.

Intel Corp. (INTC), which had long been the No. 1 chip maker in the U.S., first lost its global crown as biggest chip manufacturer to TSMC (2330) several years ago. Now, Wall Street analysts estimate that Nvidia’s (NVDA) annual revenue for its current calendar year will outpace Intel’s for the first time, making it No. 1 in the U.S. Intel is projected to see 2023 revenue of $53.9 billion, while Nvidia’s projected revenue for calendar 2023 is $56.2 billion, according to FactSet.

Even more spectacular are the projections for Nvidia’s calendar 2024: Analysts forecast revenue of $89.2 billion, a surge of 59% from 2023, and about three times higher than 2022. In contrast, Intel’s 2024 revenue is forecast to grow 13.3% to $61.1 billion. (Nvidia’s fiscal year ends at the end of January. FactSet’s data includes pro-forma estimates for calendar years.)
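The growth percentages quoted from FactSet can be checked directly from the revenue estimates above (the article's 13.3% versus a computed 13.4% reflects rounding in the underlying estimates):

```python
def yoy_growth(current, prior):
    """Year-over-year growth in percent."""
    return 100.0 * (current - prior) / prior

# FactSet estimates cited above, in billions of dollars
nvda_growth = yoy_growth(89.2, 56.2)  # ~58.7%, reported as 59%
intc_growth = yoy_growth(61.1, 53.9)  # ~13.4%, reported as 13.3%
```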

“It has coalesced into primarily an Nvidia-controlled market,” said Karl Freund, principal analyst at Cambrian AI Research. “Because Nvidia is capturing market share that didn’t even exist two years ago, before ChatGPT and large language models….They doubled their share of the data-center market. In 40 years, I have never seen such a dynamic in the marketplace.”

Nvidia has become the king of a sector that is adjacent to the core-processor arena dominated by Intel. Nvidia’s graphics chips, used to accelerate AI applications, reignited the data-center market with a new dynamic for Wall Street to watch.

Intel has long dominated the overall server market with its Xeon central processing unit (CPU) family, which is the heart of computer servers, just as CPUs are also the brain chips of personal computers. Five years ago, Advanced Micro Devices Inc. (AMD), Intel’s rival in PC chips, re-entered the lucrative server market after a multi-year absence, and AMD has since carved out a 23% share of the server market, according to Mercury Research, though Intel still dominates with a 76.7% share.

Graphics chips in the data center

Nowadays, however, the data-center story is all about graphics processing units (GPUs), and Nvidia’s have become favored for AI applications. GPU sales are growing at a far faster pace than the core server CPU chips.

Also read: Nvidia’s stock dubbed top pick for 2024 after monster 2023, ‘no need to overthink this.’

Nvidia was basically the entire data-center market in the third quarter, selling about $11.1 billion in chips, accompanying cards and other related hardware, according to Mercury Research, which has tracked the GPU market since 2019. The company had a stunning 99.7% share of GPU systems in the data center, excluding any devices for networking, according to Dean McCarron, Mercury’s president. The remaining 0.3% was split between Intel and AMD.

Put another way: “It’s Nvidia and everyone else,” said Stacy Rasgon, a Bernstein Research analyst.

Intel is fighting back now, seeking to reinvigorate growth in data centers and PCs, which have both been in decline after a huge boom in spending on information technology and PCs during the pandemic. This month, Intel unveiled new families of chips for both servers and PCs, designed to accelerate AI locally on the devices themselves, which could also take some of the AI compute load out of the data center.

“We are driving it into every aspect of the applications, but also every device, in the data center, the cloud, the edge of the PC as well,” Intel CEO Pat Gelsinger said at the company’s New York event earlier this month.

While AI and high-performance chips are coming together to create the next generation of computing, Gelsinger said it’s also important to consider the power consumption of these technologies. “When we think about this, we also have to do it in a sustainable way. Are we going to dedicate a third, a half of all the Earth’s energy to these computing technologies? No, they must be sustainable.”

Meanwhile, AMD is directly going after both the hot GPU market and the PC market. It, too, had a big product launch this month, unveiling a new family of GPUs that were well-received on Wall Street, along with new processors for the data center and PCs. It forecast it will sell at least $2 billion in AI GPUs in their first year on the market, in a big challenge to Nvidia.

Also see: AMD’s new products represent first real threat to Nvidia’s AI dominance.

That forecast “is fine for AMD,” according to Rasgon, but it would amount to “a rounding error for Nvidia.”

“If Nvidia does $50 billion, it will be disappointing,” he added.

But AMD CEO Lisa Su might have taken a conservative approach with her forecast for the new MI300X chip family, according to Daniel Newman, principal analyst and founding partner at Futurum Research.

“That is probably a fraction of what she has seen out there,” he said. “She is starting to see a robust market for GPUs that are not Nvidia…We need competition, we need supply.” He noted that it is early days and the window is still open for new developments in building AI ecosystems.

Cambrian’s Freund noted that it took AMD about four to five years to gain 20% of the data-center CPU market, making Nvidia’s stunning growth in GPUs for the data center even more remarkable.

“AI, and in particularly data-center GPU-based AI, has resulted in the largest and most rapid changes in the history of the GPU market,” said McCarron of Mercury, in an email. “[AI] is clearly impacting conventional server CPUs as well, though the long-term impacts on CPUs still remain to be seen, given how new the recent increase in AI activity is.”

The ARMs race

Another development that will further shape the computing hardware landscape is the rise of a competitive architecture to x86, known as reduced instruction set computing (RISC). In the past, RISC has mostly made inroads in the computing landscape in mobile phones, tablets and embedded systems dedicated to a single task, through the chip designs of ARM Holdings Plc (ARM) and Qualcomm Inc. (QCOM).

Nvidia tried to buy ARM for $40 billion last year, but the deal did not win regulatory approval. Instead, ARM went public earlier this year, and it has been promoting its architecture as a low-power-consuming option for AI applications. Nvidia has worked for years with ARM. Its ARM-based CPU called Grace, which is paired with its Hopper GPU in the “Grace-Hopper” AI accelerator, is used in high-performance servers and supercomputers. But these chips are still often paired with x86 CPUs from Intel or AMD in systems, noted Kevin Krewell, an analyst at Tirias Research.

“The ARM architecture has power-efficiency advantages over x86 due to a more modern instruction set, simpler CPU core designs and less legacy overhead,” Krewell said in an email. “The x86 processors can close the gap between ARM in power and core counts. That said, there’s no limit to running applications on the ARM architecture other than x86 legacy software.”

Until recently, ARM RISC-based systems have only had a fractional share of the server market. But now an open-source version of RISC, albeit about 10 years old, called RISC-V, is capturing the attention of both big internet and social-media companies, as well as startups. Power consumption has become a major issue in data centers, and AI accelerators use incredible amounts of energy, so companies are looking for alternatives to save on power usage.

Estimates for ARM’s share of the data center vary slightly, ranging from about 8%, according to Mercury Research, to about 10% according to IDC. ARM’s growing presence “is not necessarily trivial anymore,” Rasgon said.

“ARM CPUs are gaining share rapidly, but most of these are in-house CPUs (e.g. Amazon’s Graviton) rather than products sold on the open market,” McCarron said. Amazon’s (AMZN) Graviton processor family, first offered in 2018, is optimized to run cloud workloads at Amazon’s Web Services business. Alphabet Inc. (GOOG, GOOGL) also is developing its own custom ARM-based CPUs, codenamed Maple and Cypress, for use in its Google Cloud business, according to a report earlier this year by The Information.

“Google has an ARM CPU, Microsoft has an ARM CPU, everyone has an ARM CPU,” said Freund. “In three years, I think everyone will also have a RISC-V CPU….It is much more flexible than ARM.”

In addition, some AI chip and system startups are designing around RISC-V, such as Tenstorrent Inc., a startup co-founded by well-regarded chip designer Jim Keller, who has also worked at AMD, Apple Inc. (AAPL), Tesla Inc. (TSLA) and Intel.

See: These chip startups hope to challenge Nvidia but it will take some time.

Opportunity for the AI PC

Like Intel, Qualcomm has also launched an entire product line around the personal computer, a brand-new endeavor for the company best known for its mobile processors. It cited the opportunity and need to bring AI processing to local devices, or the so-called edge.

In October, it said it is entering the PC business, dominated by Intel’s x86 architecture, with its ARM-based Snapdragon X Elite platform. It has designed these new processors specifically for the PC market, where it says their lower power consumption and far faster processing will be a huge hit with business users and consumers, especially those running AI applications.

“We have had a legacy of coming in from a point where power is super important,” said Kedar Kondap, Qualcomm’s senior vice president and general manager of compute and gaming, in a recent interview. “We feel like we can leverage that legacy and bring it into PCs. PCs haven’t seen innovation for a while.”

Software could be an issue, but Qualcomm has also partnered with Microsoft for emulation software, and it trotted out many PC vendors, with plans for its PCs to be ready to tackle computing and AI challenges in the second half of 2024.

“When you run stuff on a device, it is secure, faster, cheaper, because every search today is faster. Where the future of AI is headed, it will be on the device,” Kondap said. Indeed, at its chip launch earlier this month, Intel quoted Boston Consulting Group, which forecast that by 2028, AI-capable PCs will comprise 80% of the PC market.

All these different changes in products will bring new challenges to leaders like Nvidia and Intel in their respective arenas. Investors are also slightly nervous about Nvidia’s ability to keep up its current growth pace, but last quarter Nvidia talked about new and expanding markets, including countries and governments with complex regulatory requirements.

“It’s a fun market,” Freund said.

And investors should be prepared for more technology shifts in the year ahead, with more competition and new entrants poised to take some share — even if it starts out small — away from the leaders.



How AI Is Changing Datacentres, Role of CPUs vs GPUs: Interview With Intel’s Sandra Rivera

Most people don’t really think about datacentres, but we all use Internet-connected apps, streaming services, and communication tools that rely on processing and storing massive amounts of information. As the world gets more connected and it becomes easier to create and distribute huge amounts of data, the systems and processes needed to handle all of it keep evolving. Sandra Rivera, Intel Executive Vice President and General Manager, Data Centre and AI Group, was recently in Bengaluru, and Gadgets 360 had the chance to hear about her take on current trends and her vision for the future. A lot has changed thanks to the pandemic, and of course AI is a huge part of the story going forward.

We first brought you Sandra Rivera’s comments about Intel’s ongoing work in India and everything that the company is doing here. Now, here are some more excerpts from that conversation, about innovation in hardware and software, the evolving nature of datacentres, and competing with Nvidia.

How datacentres are becoming even more important, and how things have changed in the recent past:

Sandra Rivera: All our innovations and products are clearly being driven by our customers. We are in a large and growing TAM [Total Addressable Market] and as we drive forward, nowhere is that more evident than in India, with digital transformation and the digitisation of every part of our lives. We need more compute; we’re creating more data. It needs to be compressed, secured, delivered over a network, and stored. It needs to be served up, and you also need to get valuable insights out of that, which of course is where AI comes in.

One of the interesting things that happened during COVID is that because of supply chain constraints that we all struggled through, we saw customers lean into more utilisation of the infrastructure that they had. AI, networking, and security are very hungry for the latest innovations and solutions, but a lot of the Web tier; office applications that run in cloud infrastructure; ERP systems; accounting systems; etc, are actually very focused on utilisation.

The biggest growth is happening at what we call the edge of the network, or on premises. The compute is coming to the point of data creation and data consumption. A lot of the challenge for us there is partnering with our OEMs to simplify deploying applications on-premises to process that data; to run machine learning, AI, data analytics, networking capabilities, security. That’s a lot of work both in hardware and of course in software.

That’s true here in India as well. [Some of it] is driven by power constraints and so if they can have power dedicated to those leading-edge applications and infrastructure and then cap the power on more mainstream applications, then that’s a smart use of the power budget, which is a big deal.

India has been so important for us from an R&D perspective; I mean we’ve been here for decades. We also see with all of the investments that the government is making in digital transformation and infrastructure, that India is going to be a huge consumption market for us as well. The opportunity to build out more infrastructure here, more datacentres, more enterprise solutions, software ecosystem solutions, and services, is very exciting. We continue to invest not only in the workforce but also in the market opportunities here.

The continued importance of CPUs even as GPUs are in demand, and how that is disrupting datacentre design:

Sandra Rivera: There are high-growth workloads like AI and networking driven by the continued proliferation of 5G, as well as security and storage. One of the dynamics we’re seeing in the market is that in the near term, there’s a lot of interest for accelerated compute, meaning GPUs and AI accelerators.

Customers are looking to shift a bit of their capital expenditure towards GPUs. The CPU is part of the equation, but in the near term, more of that capex spend is going to go to GPUs. We don’t think that that’s a permanent market condition. The CPU is quite good from a cost-performance-programmability perspective for many AI workloads. In many cases, customers already have a Xeon CPU, and so the fact that they can do AI machine learning [with that] is a tailwind for our business.

Intel AI Continuum

[All that] everyone talks about right now is generative AI and large language models, but AI is much more than that, right? AI is all the data preparation that happens before you train the model; it’s the data management, filtering, and cleaning. So if you are trying to build an application to identify cats, [for example] you don’t want any dogs in those pictures. All of that is done upfront with the CPU and actually almost exclusively with the Xeon today. That’s part of the AI workflow. Then you get to the actual model training phase. The CPU is very well positioned to address small to medium-sized models – 10 billion parameters or lower – or mixed workloads where machine learning or data analytics is part of a broader application. The CPU is very flexible, highly programmable, and you probably have CPUs already.
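The data-preparation stage Rivera describes — dropping mislabelled, empty, or duplicate samples before any training starts — is typical CPU-bound work. A minimal illustrative sketch (the records, labels, and paths here are hypothetical, not Intel tooling):

```python
# Illustrative data-cleaning pass for a cat classifier: keep only
# correctly labelled, non-empty, non-duplicate samples.
def clean_dataset(records):
    cleaned, seen = [], set()
    for rec in records:
        if rec.get("label") != "cat":   # drop mislabelled samples (no dogs)
            continue
        path = rec.get("path")
        if not path or path in seen:    # drop empty entries and duplicates
            continue
        seen.add(path)
        cleaned.append(rec)
    return cleaned

raw = [
    {"label": "cat", "path": "img/001.jpg"},
    {"label": "dog", "path": "img/002.jpg"},  # wrong class, filtered out
    {"label": "cat", "path": "img/001.jpg"},  # duplicate, filtered out
    {"label": "cat", "path": ""},             # empty path, filtered out
    {"label": "cat", "path": "img/003.jpg"},
]
print(len(clean_dataset(raw)))  # 2 samples survive
```

Real pipelines do this at scale with frameworks like Spark or pandas, but the work is the same branch-heavy filtering that favours CPUs over GPUs.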

When you talk about the largest models, with 100, 200, 300 billion parameters – there you need a more parallel architecture, which is what a GPU provides, and you also benefit from dedicated deep learning acceleration, like we have in Gaudi. After you train the model, you get to what we call the inference or deployment phase. Typically, you’re on-premises there. If you are in a retail organization or a fast food restaurant, you will typically be running that on either a CPU or some less power-hungry, less expensive accelerator. In the inference stage, we can compete very effectively with our CPUs and some of our smaller GPUs and accelerators.
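Some back-of-the-envelope arithmetic (our illustration, not from the interview) shows why the 10-billion-parameter threshold Rivera mentions is roughly where a CPU stops being comfortable: weight storage alone grows linearly with model size.

```python
# Memory needed just to hold model weights at 16-bit precision
# (2 bytes per parameter). Optimizer state and activations during
# training add several multiples on top of this.
def param_memory_gb(params_billions, bytes_per_param=2):
    # 1e9 params * bytes / 1e9 bytes-per-GB cancels out
    return params_billions * bytes_per_param

print(param_memory_gb(10))   # 20 GB: fits in a big Xeon server's RAM
print(param_memory_gb(300))  # 600 GB: demands many accelerators in parallel
```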

Right now, there’s a lot of interest around those largest language models and generative AI. We see more customers saying they want to make sure that they have some GPU capabilities. We do see that dynamic, but long-term, the market is complex. It’s growing. We’re in the early days of AI. We think that we have a very good opportunity to play with the breadth of capabilities that we have across our portfolio. So it’s not that I think that generative AI is small; but it’s not addressable only with a large-scale GPU.

How Intel sees Nvidia, and how it plans to compete

Sandra Rivera: Everyone knows that Nvidia is doing a great job of delivering GPUs to the market. It’s a giant player. Let me put that in perspective. The Gaudi 2 has better performance than the Nvidia A100, which is the most pervasive GPU today. It doesn’t have more raw performance versus H100 right now, but from a price-performance perspective, it’s actually very well positioned. One of the data formats supported in the Gaudi 2 hardware is FP8, and the software to support that is going to be released next quarter. We expect to see very good performance, but you’ll have to wait and see what we publish in November. Next year, we’ll have Gaudi 3 in the market which will be competing very effectively with H100 and even the next generation on the Nvidia roadmap. Our projections look very good. We’re priced very aggressively. Customers want alternatives and we absolutely want to be an alternative to the biggest player in the market. It’s going to be what we do, not what we say.

Intel’s roadmap for building sustainable datacenters.

Sandra Rivera: We use over 90 percent and in some cases 100 percent renewable energy in all our manufacturing across the world. We are second to no one in renewable energy and total carbon footprint for the manufacturing of our products. The competition, like most of the world, is building their products in foundries either in Taiwan or in Korea. Of course Taiwan is the biggest, but the footprint that they have in renewable energy is actually quite small. It’s an island; everything gets shipped using diesel fuel. When we look at the datacentres that we’re building ourselves for our own fabs and our own IT infrastructure, again that’s 90 percent plus renewable energy. We also partner very closely with our OEMs as well as cloud service providers to help optimise around green and renewable energy.

With the 4th Gen Xeon we introduced a power-optimised mode where you can actually use 20 percent less energy by being smart about turning off cores during idle times and tuning the processor. We were able to do that with a very small performance impact, less than 5 percent, and customers like that because they don’t always need the processor to be running at full capability and they can save a lot of energy.
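Taken at face value, those two figures imply a solid efficiency win. A quick check of the arithmetic, assuming the worst-case 5% performance hit:

```python
# Perf-per-watt implied by the quoted 4th Gen Xeon figures:
# 20% less energy at under 5% performance cost (worst case assumed).
energy_factor = 0.80   # energy used in power-optimised mode
perf_factor = 0.95     # performance retained, worst case
efficiency_gain = perf_factor / energy_factor  # ~1.19x
print(f"~{(efficiency_gain - 1) * 100:.0f}% better performance per watt")
```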

The current state and future potential of neuromorphic and quantum computing in datacentres

Sandra Rivera: Neuromorphic and quantum computing are leading-edge technologies. We’ve been an investor in quantum for at least a decade and a half. We’ve been investors in silicon photonics; optical networking and interconnects have become increasingly interesting, especially in these very high-end, large-scale computing platforms. We know that memory technologies are going to be critical for us going forward. We’ve been investors in memory technologies with partners and on our own. The commercial viability of those technologies is sometimes 10-20 years out, but innovation is the lifeblood of our business. We have extraordinary capabilities with Intel Labs. We have so many fellows, senior fellows and industry luminaries. The process technology is some of the most complex and exquisite engineering in the world.

We’ll continue to lead from an innovation perspective. Commercial viability all depends on how fast markets shift. We do think that AI is disruptive, and some of those technologies will probably be [developed] at an accelerated pace, particularly networking and memory. There are lots of innovations in power and thermals; these chips and systems are getting bigger and hotter. It’s not always easy to answer when the timing is [right]. Some of these technologies may not have commercial success, but you take parts of them and channel them into other areas. I think this is the business of innovation and we’re very proud of our history. Those [teams] get to do a lot of very fun things and they’re very energised.

Some responses have been condensed and slightly edited for clarity.

Disclosure: Intel sponsored the correspondent’s flights for the event in Bengaluru.





MSI Titan GT77 HX 13VI Review: The performance brute, reborn with more power and grunt- Technology News, Firstpost

Pros:
- The sheer brute performance
- The Cherry MX mechanical keyboard
- Expandability and upgradability in terms of storage and RAM
- PCIe Gen 5 M.2 storage slot
- Excellent thermal management
- The subtle but aggressive aesthetics
- The 4K 144Hz MiniLED display
- Great selection of ports and I/O

Cons:
- The price
- Average webcam considering the other specifications and the price
- Too bulky, even for a gaming laptop
- Choice of materials could have been better

Price: Rs 6,71,990/-
Rating: 4.75/5

Last year, when we reviewed the MSI Titan GT77 UHS, we were very impressed by it. Powered by the Intel i9-12900HX and the NVIDIA RTX 3080Ti Laptop GPU, it was easily the most powerful computer we used that year. In fact, we stated in our review that most people don’t have full-fledged desktops that could go toe to toe against the Titan GT77.

Image Credit: Tech2 | Mehul Reuben Das

Well, MSI is back with another iteration of the Titan GT77, and it is more bonkers than ever before. At first glance, it seems that not a lot has changed – it still has the patently bonkers gamer aesthetic, and the backbreaking heft and bulk, of last year’s model. Look closely, though, and you’ll see that the newest generation of the Titan GT77 is a beast reincarnated in a much stronger avatar. MSI seems to have taken all the numbers from last year’s Titan GT77 and turned them up to 11.

The Titan GT77 continues to be a legitimate desktop replacement, delivering desktop-level performance in a reasonably portable form factor. Yes, a few gaming laptops match the new MSI Titan GT77’s specifications on paper, but each has some omission or other that leaves a sour taste in your mouth, especially when you know there exists a machine that makes absolutely no compromises when it comes to sustained performance, in gaming as well as other intensive tasks.

As always with the Titan GT77, while you can use this machine for intense gaming and experience impressive results, its true purpose lies in tackling far more demanding tasks. Who exactly, and for what tasks? That’s exactly the question we’ll tackle in this review of the MSI Titan GT77 as we explore the intended audience for this desktop-replacement laptop.

MSI Titan GT77 HX 13VI Review: Specs and features
The unit we tested was the MSI Titan GT77 HX 13VI, featuring an Intel Core i9-13950HX CPU. Our specific configuration included 64GB (2x32GB) of DDR5 RAM in a dual-channel setup, running at 4800MHz. However, it’s worth noting that the laptop supports up to 128GB of RAM thanks to two additional SO-DIMM slots.

Retail units of the MSI Titan GT77 HX 13VI will be equipped with an Intel Core i9-13980HX CPU, which has slightly more powerful cores, with clocks up by 100 MHz. This is supposed to aid in lighter, more office-oriented tasks.

MSI Titan GT77 HX 13VI Review (8)
Image Credit: Tech2 | Mehul Reuben Das

As for the GPU, we had the laptop variant of the NVIDIA RTX 4090, offering 16GB of GDDR6 VRAM. The laptop-grade GPU has a total of 175W of power to play with, so you know that the GPU is well-fed. Additionally, the laptop includes Intel’s UHD Graphics for lighter tasks.

The display on our test unit was a 17.3-inch UHD 4K (3840 x 2160) MiniLED display, boasting an impressive refresh rate of 144Hz and certification for HDR 1000.

In terms of storage, our unit came equipped with PCIe Gen 4 SSD storage totalling 2TB. The MSI Titan GT77 features three M.2 slots, one of which supports PCIe Gen 5.

As for the ports, you get the following:

  • 1x Type-C (USB / DP / Thunderbolt™ 4) with PD charging
  • 1x Type-C (USB / DP / Thunderbolt™ 4)
  • 3x Type-A USB3.2 Gen2
  • 1x SD Express Card Reader
  • 1x HDMI™ 2.1 (8K @ 60Hz / 4K @ 120Hz)
  • 1x Mini-DisplayPort
  • 1x RJ45 that supports up to 2.5G

For wireless connectivity, you get a Killer AX1690i module that supports WiFi 6E as well as Bluetooth 5.3.

MSI Titan GT77 HX 13VI Review (10)
Image Credit: Tech2 | Mehul Reuben Das

Powering the device is a non-removable 4-cell, 99.99Wh battery, accompanied by a 330W charging brick with a proprietary connector.

For security features, the Titan GT77 offers an IR camera and fingerprint-based biometrics, a webcam shutter for the integrated camera, and Firmware Trusted Platform Module (TPM) 2.0.

MSI Titan GT77 HX 13VI Review: Design and build quality
Like last year, the design of the latest-generation MSI Titan GT77 is far from understated. MSI is continuing with the same design we saw last year, which makes one thing very clear – the Titan GT77 has a legacy and a lineage that MSI feels should be recognisable at once.

MSI Titan GT77 HX 13VI Review (9)
Image Credit: Tech2 | Mehul Reuben Das

The laptop sports an all-black colour scheme and very aggressive styling. The prominent vents on the sides and back are the first identifiers of the performance beast lurking under the chassis, even when you switch the RGB lighting off. The rear exhaust vents feature customizable RGB lighting, allowing users to assign different colours to each vent outlet using the SteelSeries GG app.

The MSI Titan GT77 is built like a tank – you can feel the heft just by looking at it. The laptop weighs over 3.3 kilograms on its own, and the power brick needed to keep this beast juiced up weighs another 1.4 kilograms. However, considering its purpose as a true desktop replacement and the performance it delivers, the weight becomes more reasonable.

The laptop features a metallic top lid that houses the display. The lid is sturdy and shows minimal flex, and is slightly recessed from the edge of the clamshell, creating a noticeable protrusion at the rear.

Additionally, the lid showcases the illuminated MSI shield logo, enhancing its gamery vibes. It is attached to a robust yet solid hinge that can be easily opened with one hand, preventing unnecessary swaying of the panel. As for the bezels, the left, top, and right edges boast thin bezels, while the bottom edge carries a thicker bezel with the MSI logo.

MSI Titan GT77 HX 13VI Review (3)
Image Credit: Tech2 | Mehul Reuben Das

The interior of the laptop is primarily composed of plastic, but it doesn’t feel cheap. However, the choice of materials could have been better, as the entire laptop is a fingerprint magnet. The clean and sharp aesthetics of the device are easily besmirched by fingerprints, and you can’t realistically be wiping the laptop down with a microfibre cloth every 5 minutes.

The keyboard shows virtually no flex, and on the left and right sides of the keyboard, there are 2W speakers. Towards the bottom-left corner of the keyboard, we see the Cherry MX branding, again, illuminated with RGB.

MSI Titan GT77 HX 13VI Review (6)
Image Credit: Tech2 | Mehul Reuben Das

The bottom of the laptop consists of two parts. The top half is made of metal, most likely aluminium, allowing for better heat dissipation due to the presence of well-positioned vents. The other half is constructed of plastic. Additionally, two 2W woofers can be seen on the bottom of the laptop.

MSI Titan GT77 HX 13VI Review: Keyboard and trackpad
The MSI Titan GT77 is equipped with a low-profile mechanical keyboard developed by SteelSeries, incorporating Cherry MX switches. From the feel and sound of it, they seem to be Cherry MX Brown switches. This results in one of the finest laptop keyboards available, offering a delightful typing and gaming experience. Additionally, there is a dedicated numeric keypad on the right-hand side, albeit with slightly smaller keys, which enhances typing convenience. It’s important to note that the numpad, function-row and arrow keys may not feel as tactile as the Cherry switches and have a more membrane-like sensation.

MSI Titan GT77 HX 13VI Review (7)
Image Credit: Tech2 | Mehul Reuben Das

Similar to other keyboards featuring Cherry MX switches, the key switches on the Titan GT77 offer excellent actuation and a satisfying tactile feel. The keyboard is equipped with per-key RGB backlighting, allowing users to customize the lighting according to their preferences. The included SteelSeries GG software also allows users to easily create and manage custom profiles, making it one of the most user-friendly configurators.

The trackpad on the laptop is notably large and lacks physical buttons. It functions as a standard multi-gesture trackpad, providing responsive and precise control. The surface of the trackpad feels smooth and pleasant to the touch, and it has very precise palm rejection, which is particularly beneficial considering its size. MSI need not have gone for such a good trackpad on the Titan GT77, considering that a majority of users will be using a mouse with it anyway. Having said that, we’re glad that MSI did.

MSI Titan GT77 HX 13VI Review (11)
Image Credit: Tech2 | Mehul Reuben Das

MSI Titan GT77 HX 13VI Review: Webcam and speakers
The MSI Titan GT77 still uses a 720p camera, which is housed in the thin top bezel of the display. Given that some people may think of using the webcam to stream, it would have been preferable to see at least a 1080p sensor instead. Further still, considering what people will pay for the laptop and the fact that it is the best of the best when it comes to specifications, a 4K sensor would have been ideal.

MSI Titan GT77 HX 13VI Review (5)
Image Credit: Tech2 | Mehul Reuben Das

Having said that, the image and video quality produced by the webcam is decent enough for general purposes, like attending a meeting or a video call. Like last year, the camera gets IR capabilities, which makes it great for biometrics. One thing that has been added to this year’s GT77 is a physical webcam shutter for increased privacy.

The built-in microphone, unlike the webcam, performs well and is suitable even for streaming. During video calls, it effectively isolates the speaker’s voice from any surrounding background noise.

The audio output of the device consists of a pair of 2-watt speakers and a pair of 2-watt woofers, all facing the user. These speakers provide high-quality sound with a distinct mid-range, ample separation between high and low frequencies, and a satisfying bass presence. Throughout our testing, we encountered no instances of rattling, distortion, or any other undesirable problems when playing bass-heavy tracks. Although the maximum volume level is somewhat on the lower side, it can still reach sufficient loudness to fill a room, though it might leave some users slightly disappointed.

MSI Titan GT77 HX 13VI Review: Display
Last year, the unit we tested came with a 17.3-inch 1080p IPS display with a refresh rate of 360Hz, while the units available in India came with a 4K, 120Hz IPS display. This year, MSI has made some massive updates to the display. For the 2023 version, we get a MiniLED panel with a 4K resolution, measuring 3840 x 2160 pixels, and a refresh rate of 144Hz. Additionally, there is an alternative option of a QHD IPS screen with a refresh rate of 240Hz.

MSI Titan GT77 HX 13VI Review (4)
Image Credit: Tech2 | Mehul Reuben Das

Mini-LED screens ensure true black images without any backlight bleeding. MSI promotes over 1000 dimming zones, although some blooming may still be observed. When bright objects are displayed on dark backgrounds, the entire dimming zone illuminates, resulting in bright clouds. However, this effect is primarily noticeable when logos are displayed and not during gaming or regular usage. The average brightness is measured at 600 nits, and the low black value contributes to a remarkably high contrast ratio.

The panel supports HDR 1000, and our tests show that it has a maximum brightness of over 1000 nits. Users will need to manually activate HDR. Moreover, the different colour profiles that come with MSI’s True Color software cannot be used, and HDR cannot be utilized while on battery power. This is because Windows falls spectacularly short when it comes to implementing HDR.

Speaking of True Colour, the Titan GT77 comes equipped with the True Color software, which provides various preconfigured settings for colour spaces and situations such as gaming, office work, and movie viewing. Additionally, the software allows users to calibrate the screen according to their preferences.

MSI Titan GT77 HX 13VI Review: Performance in Productivity and Gaming
This is where the MSI Titan GT77 shines the brightest, and it is the main reason why the select few people who can actually afford this laptop should go for it. Last year’s Titan GT77 was a beast in terms of performance. The newer, 2023 version of the GT77 is an even more powerful and, surprisingly, more efficient beast – but a beast nonetheless.

The 2023 version of the Titan GT77 comes with an Intel Core i9-13950HX CPU, which has 24 cores and 32 threads – 8 top-tier Performance cores and 16 Efficiency cores. The P-cores boost all the way up to 5.5GHz, whereas the E-cores go up to 4GHz.
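Those core and thread counts are consistent once you recall that P-cores run two threads each via Hyper-Threading while E-cores run one. A quick sanity check of the arithmetic:

```python
# i9-13950HX topology as described: 8 P-cores (2 threads each via
# Hyper-Threading) plus 16 single-threaded E-cores.
p_cores, e_cores = 8, 16
total_cores = p_cores + e_cores
total_threads = p_cores * 2 + e_cores
print(total_cores, total_threads)  # 24 32
```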

MSI Titan GT77 HX 13VI Review (1) Benchmarks
Image Credit: Tech2 | Mehul Reuben Das

As far as laptop-grade CPUs are concerned, this is bested only by the Intel Core i9-13980HX, which is what retail customers will be getting. Needless to say, it is among the very best laptop CPUs out there right now.

Even the Intel Core i9-13950HX is a pretty powerful CPU. The only differences between the two chips are that the 13980HX offers a 100 MHz higher maximum clock for the P-cores and drops vPro support. vPro won’t help much with gaming, but it does help slightly with work-related tasks. Nonetheless, the CPU is a great example of just how much raw power and efficiency can be packed into a single SoC, thanks to Intel’s hybrid architecture.

As for the GPU, we get a laptop version of the NVIDIA GeForce RTX 4090 – again, the best that a laptop can be equipped with right now. The RTX 4090 in the MSI Titan GT77 boosts up to 2340 MHz and comes with 16GB of GDDR6 VRAM and a TDP of 175W. What this means is that the GPU has plenty of room to stretch its legs and perform as a 4090 should.

The net result is that the Titan GT77 truly is the king of performance among portable laptops. The Intel Core i9-13950HX crushes all benchmarks that you throw at it and is bested only by proper desktop-grade K-series CPUs from Intel. During our testing, it fared better than almost all other laptops that we tested this year, by quite a margin. And thanks to Intel’s hybrid architecture, it got some of the highest scores we have seen across benchmarks, both in single-threaded and multithreaded workloads.

The GPU, too, crushes every synthetic benchmark thrown at it. MSI has given the RTX 4090 a TDP of 175W, so both the Intel Core i9-13950HX and the RTX 4090 are properly fed when it comes to power. MSI also allows you to play with the clock speeds of the Core i9-13950HX using MSI Centre’s profiles. We did all of our testing, benchmarking and gaming in the Extreme Performance mode to get the best out of the device.

Apart from keeping the giants properly fed, MSI has also ensured that the CPU and the GPU are adequately cooled. The Titan GT77 comes with a slightly updated version of MSI’s Cooler Boost Titan system that helps maximize the i9-13950HX’s and RTX 4090’s performance efficiency. This year, you get 4 fans and a heat sink with 8 pipes and 6 exhausts.

In benchmarks, the Titan GT77 tops nearly every test you run it through, as it should. In 3DMark Time Spy it scores 20140; in Cinebench R23 it scores 2120 in the single-core and 30065 in the multi-core test. In PCMark 10, it gets a very solid score of 8801, the highest we have seen in a laptop.

In PugetBench, it has an overall score of 1213 for Photoshop and 1530 for Lightroom. And in CrossMark, it scores 1864 for productivity, 2441 for creativity, 1623 for responsiveness, and 2051 overall.

We have only seen top-tier desktop CPUs and GPUs put up scores comparable to this. Having said that, we have always maintained that benchmark numbers do not necessarily reflect how a device actually performs in real life. For that, you have to turn to gaming and other real-life applications.

We tested games like Far Cry 5, Far Cry 6, Shadow of the Tomb Raider, Metro Exodus, and the recent Call of Duty Modern Warfare 2. We also played CS:GO, but did not include it in our test results – the numbers were just ridiculous, but more on that later.

MSI Titan GT77 HX 13VI Review (1) Gaming
Image Credit: Tech2 | Mehul Reuben Das

As stated earlier, we did all of our testing, including the benchmarks, using MSI’s Extreme Performance mode to extract the maximum possible juice from the processor and GPU package. As for in-game settings, we used the highest possible presets, enabling DLSS where possible, along with antialiasing. We tested the games at 1080p, because that is what most gamers would go for given the size of the panel, and in 4K, given that our unit had a 4K panel.

In 1080p gaming, with the settings turned up as far as we could and with DLSS on where possible, we had an awesome experience. In Far Cry 5, we were averaging 158 FPS; in Far Cry 6, we were getting a comfortable 142 FPS. In Shadow of the Tomb Raider, we were consistently getting an average of 212 FPS. Metro Exodus was giving us an ultra-smooth 131 FPS. And in Call of Duty Modern Warfare 2 (2022), we were averaging 174 FPS.

We get to see a similar story in 4K gaming as well. In Far Cry 5, we were getting 131 FPS; in Far Cry 6, a pretty smooth 87 FPS. In Shadow of the Tomb Raider, we were getting an impressive 101 FPS. In Metro Exodus, we were getting a very playable 79 FPS. And in Call of Duty Modern Warfare 2 (2022), we were averaging a very healthy 83 FPS.

Coming to CS:GO, we were getting a ridiculous 600+ FPS at 2K with the details cranked up. We did not bother testing it at 4K, but rest assured, it should be somewhere between 300-400 FPS at the minimum.

MSI Titan GT77 HX 13VI Review: Battery
The Titan GT77 comes with one of the largest batteries ever fitted to a laptop: a 4-cell, 99.99Wh unit, paired with a 330W charging brick. But because of the hardware this laptop packs and the performance it delivers, the Titan GT77’s battery life isn’t anything to write sonnets about. Still, it lasts a little longer than the last generation did.

MSI Titan GT77 HX 13VI Review (1)
Image Credit: Tech2 | Mehul Reuben Das

On an average day of work, which consisted of a ton of writing, some photo editing, and quite a bit of content consumption on YouTube and Netflix, we got about 6-7 hours of usage at about 40-50 per cent screen brightness. Do note that this was in Silent mode and on the integrated Intel GPU.

During our extended battery test, where we played a 4K video on YouTube at 75 per cent brightness and 50 per cent volume, with all RGB lights on, the MSI Titan GT77 lasted just under 5 hours. For a laptop as performance-packed as this, that is actually very impressive.

While gaming without the charger, the laptop conked out after an hour or so, even with reduced screen brightness. Performance also took a minor hit without the charger.

MSI Titan GT77 HX 13VI Review: Verdict
The MSI Titan GT77 is not a machine for everyone, not even for the most avid of gamers. Unless you’re planning to take up e-sports and gaming as a career, or are planning to get into AI/ML development or to render a lot of CAD designs or videos, this is not the laptop for you. For most games and purposes, it is overkill.

So, who is the Titan GT77 designed for? We believe it is tailored for high-performing content creators, gamers, machine learning engineers, data scientists, and game developers who are frequently on the move and require a true desktop replacement that can be easily transported.

MSI Titan GT77 HX 13VI Review (12)
Image Credit: Tech2 | Mehul Reuben Das

As powerful as the Titan GT77 is, it has a few drawbacks. We wish it had a higher-quality webcam, and that it weren’t priced this prohibitively. Nevertheless, once you experience the powerful synergy between the Core i9-13950HX processor and the RTX 4090 graphics card, these minor drawbacks fade into insignificance.

If you seek uncompromising and unrestrained performance, and if you have the financial means to invest in a laptop priced at around Rs 6.5 Lakhs, then look no further. This is the epitome of what laptops for serious professionals and professional gamers were always intended to be.




Intel 13th Gen i7 13700K CPU Review: A processor that’s clearly punching above its class- Technology News, Firstpost

Pros
– Sheer Performance
– Pricing compared to competitors
– Larger Cache and Core Count
– Solid multicore and single-core performance
– Hybrid Architecture
– Compatible with previous Chipsets
– Performance almost at par with the i9 at just a fraction of the cost

Cons
– Power consumption
– Needs a proper cooling solution to extract all the performance.
– Possibly the last generation of CPUs using the LGA 1700 Socket

Price: Rs 44,500
Rating: 4.5/5

Intel 13th Gen i7 13700K Review: Overview
Intel’s i7 lineup of CPUs has been the go-to for high-end gaming desktops for years now. Yes, there’s the i9 lineup, but unless you’re a serious content creator who’ll actually use all the cores at their highest speeds, the i9 would be like taking a fighter jet to a gunfight. Yes, you can do that, but it’s one overachieving response if there ever was one.

Image Credit: Tech2 | Mehul Reuben Das

The lower-tiered i5 13600K is a solid mid-tier performer and has been crucial in establishing Intel’s position as the go-to CPU for mid-tier gaming. The i7-13700K takes all that performance, turns the dial up to 11 and just goes for it. 

For a few years now, Intel has had to play catch-up with AMD’s Ryzen series and the performance it offered across different price points. With the 13th Gen Intel processors, especially the i5 13600K and the i7 13700K, team red finally has a competitor that puts up a proper, hard fight.

What sets the latest generation of Intel’s CPUs apart is their hybrid architecture. Intel introduced its current hybrid architecture with the Alder Lake lineup. With Raptor Lake, Intel has fine-tuned it and turned the dial all the way up to 11.

The 13th Gen CPUs are still based on the same Intel 7 process as the 12th Gen, and they retain the same hybrid design of performance cores (P-cores) and efficiency cores (E-cores), but the Core i7 13700K now has the same number of cores as the previous generation’s Intel Core i9 12900K. All in all, the i7 13700K has twice as many E-cores as its predecessor. Basically, you get 8 Performance cores, 8 Efficiency cores, and 24 threads.

Intel i7 13700K CPU Review (8)
Image Credit: Tech2 | Mehul Reuben Das

It genuinely is a better version of the i9 12900K, but for a lot less money. Because Hyper-Threading is enabled on the Core i7 13700K, the 8 P-cores account for 16 threads; the E-cores do not support multithreading. This brings the total thread count on this SKU to 24, compared to the Core i7 12700K’s 20.
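For the curious, the thread math behind those figures is easy to verify. A tiny sketch, using the core counts quoted in this review:

```python
# Thread-count arithmetic for Intel's hybrid layout, as described above.
# Assumption: P-cores are hyper-threaded (2 threads each), while E-cores
# run a single thread each.

def total_threads(p_cores: int, e_cores: int) -> int:
    """Return the total thread count for a hybrid P-core/E-core CPU."""
    return p_cores * 2 + e_cores

# Core i7 13700K: 8 P-cores + 8 E-cores
print(total_threads(8, 8))  # 24

# Core i7 12700K: 8 P-cores + 4 E-cores
print(total_threads(8, 4))  # 20
```

The four extra E-cores alone account for the jump from 20 to 24 threads, since the P-core count is unchanged between the two generations.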

The end result is that you get a fantastic CPU that punches above its weight class, and performs like a beast. 

Intel 13th Gen i7 13700K Review: Specifications
Intel’s 13th Gen CPUs, including the i5 13600K and the i7 13700K, are based on the Raptor Lake architecture, which is a generational improvement over Alder Lake. Like the previous generation of Intel CPUs, the i7 13700K uses the LGA 1700 socket and is made on the Intel 7 manufacturing process. The 13th Gen CPUs are expected to be the last built on Intel 7 before the move to Intel 4, and the last to support the LGA 1700 socket.

Intel i7 13700K CPU Review (5)
Image Credit: Tech2 | Mehul Reuben Das

The i7 13700K comes with a total of 16 cores, consisting of 8 P-cores and 8 E-cores, and has a total of 24 threads. The P-cores have a base clock of 3.40GHz, whereas the E-cores have a base clock of 2.50GHz. The maximum turbo frequency is 5.4GHz; this is a single-core frequency courtesy of Turbo Boost Max Technology 3.0, and it is dependent on temperature and power conditions, as well as the load type.

The P-cores’ maximum turbo frequency is 5.3GHz, and the E-cores’ is 4.2GHz. As for power consumption, the CPU draws 125W at base frequencies and up to 253W at maximum turbo.

Multi-threaded performance gets another boost thanks to Intel’s hyperthreading technology on the higher-boosting P-cores. 

Intel i7 13700K CPU Review (3)
Image Credit: Tech2 | Mehul Reuben Das

All of this makes the 13th Gen lineup some of the fastest consumer-grade desktop CPUs in the world, straight out of the box.

The total L2 cache is 24MB, while the L3 cache is 30MB. In comparison, the Intel i7 12700K had an L2 cache of 12MB and an L3 cache of 25MB, whereas the Intel i9 12900K had an L2 cache of 14MB and an L3 cache of 30MB. The increase in cache across the board is significant.

Intel has raised the L2 cache for each P-core from 1.25MB to 2MB and doubled the L2 cache for each cluster of E-cores to 4MB in the 13700K. Intel has also increased the L3 cache by 5MB.
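Those per-core and per-cluster figures add up exactly to the 24MB total L2 quoted earlier, assuming the E-cores are grouped in clusters of four (so 8 E-cores form 2 clusters). A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on the i7 13700K's total L2 cache,
# using the per-core/per-cluster figures quoted above.

P_CORES = 8
L2_PER_P_CORE_MB = 2.0       # raised from 1.25MB on the 12th Gen
E_CORE_CLUSTERS = 8 // 4     # 8 E-cores, grouped in clusters of 4
L2_PER_E_CLUSTER_MB = 4.0    # doubled from 2MB per cluster

total_l2_mb = P_CORES * L2_PER_P_CORE_MB + E_CORE_CLUSTERS * L2_PER_E_CLUSTER_MB
print(f"Total L2: {total_l2_mb:.0f}MB")  # Total L2: 24MB
```

So 16MB of the L2 sits on the P-cores and 8MB on the E-core clusters, which is where the generational cache bump comes from.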

As for onboard graphics, the Core i7 13700K comes with the UHD 770 GPU, which has a base clock of 300MHz, a boost clock of 1.6GHz, and 32 execution units. The maximum resolution supported over HDMI is 4096×2160 at 60Hz; over DisplayPort, it is 7680×4320 at 60Hz. The CPU also supports Intel Quick Sync Video and Clear Video HD technology, and can drive up to 4 displays.

Intel i7 13700K CPU Review (4)
Image Credit: Tech2 | Mehul Reuben Das

The Core i7 13700K supports both DDR4 and DDR5 RAM, just like the Core i5 13600K and the previous-gen Alder Lake CPUs. However, with the Raptor Lake CPUs, users get a faster memory controller that supports DDR5 at up to 5600MHz. You also get support for up to 128GB of RAM in a dual-channel configuration, along with ECC memory support.

The CPU has eight DMI lanes based on DMI 4.0. As for PCIe, the SKU offers a total of 20 lanes, compatible with both PCIe Gen 4.0 and 5.0: 16 for the GPU and 4 for a compatible NVMe SSD.

Intel 13th Gen i7 13700K Review: Compatibility
Although Intel’s 13th Gen CPUs launch alongside the new Z790 chipset, they are also backwards compatible with the Z690, as well as the H670 and B660 chipsets, all of which use the LGA 1700 socket. Just ensure that your motherboard has had its BIOS and firmware updated for the 13th Gen CPUs, and you’ll be good to go.

Intel i7 13700K CPU Review (7)
Image Credit: Tech2 | Mehul Reuben Das

The Z690 and H670 motherboards do let you use DDR5 memory, but with varying limitations on XMP profiles, so you might want to look into that. The Z790 motherboards, though slightly more expensive, are well worth the money, as they have some new features that the Z690 and similar boards don’t. For example, a Z790 board from a reputable partner manufacturer will have PCIe 5.0 support, as well as a bunch of other creature comforts, which will prove beneficial in the long run.

And since this is an i7 we are talking about, pairing it with the best available RAM would give you that little extra boost, so again, we recommend investing in a Z790 board.

Intel 13th Gen i7 13700K Review: Our test bench
For our tests, we ran the Core i7 13700K on a Z790 motherboard. Given the power consumption of the chip, we did not want to risk leaving any performance on the table, so for cooling we went with our own MSI MAG CoreLiquid 360R V2. For RAM, we went with a Vengeance 32GB (2x16GB) DDR5 kit running at 5200MHz that Corsair lent us.

For the GPU, we used our own MSI GeForce RTX 3070 Ti Ventus 3X 8G OC, with a Corsair CX750 modular PSU for power.

Intel i7 13700K CPU Review (6)
Image Credit: Tech2 | Mehul Reuben Das

Intel 13th Gen i7 13700K Review: Performance
It should not come as a surprise that the Core i7 13700K completely annihilates the last-generation Core i7 12700K across benchmarks. What is surprising, though, is that it performs just as well as the last-generation Core i9 12900K in a number of scenarios, with very few sacrifices in terms of power consumption and temperatures. This is in line with our findings from our review of the Core i5 13600K.

Intel i7 13700K CPU Review Cinebench
Cinebench R23 Benchmarks. Higher numbers are better.

There are a number of reasons why the 13th Gen CPUs perform so much better than the 12th Gen. First is the increase in the number of E-cores across the lineup, and particularly in the Core i7 13700K. Second, Intel has increased the size of the L2 caches on its Raptor Lake processors, along with a slight improvement in read and write speeds.

Intel’s 13th-generation CPUs also include some intriguing capabilities that make life easier for consumers, particularly when it comes to background operations. Thread Director, for example, collaborates with the Windows scheduler: the CPU categorises threads and helps the operating system determine which cores to employ for which tasks. This allows for significantly more efficient management of background operations.

Intel i7 13700K CPU Review Pugetbench
Pugetbench Benchmarks. Higher numbers are better.

If we take a closer look at how the cores behave during the Cinebench run, we see that the i7 13700K maintained an all-core frequency of 5.2GHz for the P-cores and 4.1GHz for the E-cores after an hour of load testing. For single-core operations, the 13700K sustained a clock frequency of 5.4GHz under stress.

Please bear in mind that this turbo frequency applies only to a single core and depends on the load type; a plethora of variables can affect it: thermal headroom, power limits, the cooling solution, thermal paste application, the quality of the motherboard, and so on.

Intel i7 13700K CPU Review CrossMark
Crossmark benchmarks for the Intel i7 13700K

In games as well, the CPU performed exceptionally well, which, given its benchmark scores, shouldn’t come as a surprise. In Battlefield 5, we were consistently getting 175+ FPS at 1440p, whereas in titles like Far Cry 6, Forza Horizon 5, and Borderlands 3, we were consistently getting over 100 FPS at 1440p.

Intel 13th Gen i7 13700K Review: Power consumption & thermals
When it comes to core power usage, the i7 13700K is close to the i9 12900K, but still slightly lower thanks to Raptor Lake’s architectural upgrades. Having said that, it is more power-hungry than the i7 12700K.

Intel i7 13700K CPU Review Temp
System-wide power draw during Cinebench Benchmarks and Idle. Lower numbers are better.

The i7 13700K’s thermal performance is again par for the course considering its overall speed, though under sustained load it runs at 98-99 degrees Celsius, fluctuating between the two, without hampering performance.

Intel i7 13700K CPU Review Temp
Maximum CPU temperature during Cinebench Benchmarks. Lower numbers are better.

Even under the heaviest of loads, our CPU did not hit the dreaded 100 degrees in a closed case. 

Intel 13th Gen i7 13700K Review: Verdict
Overall, the Core i7 13700K has a well-balanced performance profile, with no obvious shortcomings in any particular type of productivity app. As a result, the 13700K is an agile gaming chip as well as a great all-rounder for productivity tasks.

The Core i7 13700K has a strong price-to-performance advantage, which is even greater when we consider motherboard and memory costs. Intel has maintained compatibility with DDR4 memory, enabling a route to substantially lower memory and motherboard costs than AMD’s AM5 ecosystem, which requires DDR5. Intel claims that if a previous-generation 600-series motherboard has appropriate power delivery, you won’t lose any performance, opening up another avenue for value hunters.

Intel i7 13700K CPU Review (1)
Image Credit: Tech2 | Mehul Reuben Das

If you’re looking only to game on your system and want a reasonable, mid-tier gaming tower, then the Intel Core i5 13600K is still the better choice: you can easily start off lower and build your system up gradually, especially if budget is a constraint and you’re sure you won’t get into serious content creation anytime soon.

However, if you plan on streaming or any serious content creation, you will be better off with the Core i7 13700K. Think of it this way: try to spec out a similar machine from team red and you will have to spend considerably more, and even then you will be leaving some single-threaded as well as multithreaded performance on the table. And because its performance is almost at par with last year’s i9 12900K, if not better in most scenarios, you get to enjoy the fruits of a top-of-the-line CPU at a significantly lower cost.


