Nvidia and AI changed the landscape of the chip industry, as rivals play catch-up

This year’s artificial-intelligence boom turned the landscape of the semiconductor industry on its head, elevating Nvidia Corp. as the new king of U.S. chip companies — and putting more pressure on the newly crowned company for the year ahead.

Intel Corp., which had long been the No. 1 chip maker in the U.S., lost its global crown as the biggest chip manufacturer to TSMC several years ago. Now, Wall Street analysts estimate that Nvidia’s annual revenue for the current calendar year will outpace Intel’s for the first time, making it No. 1 in the U.S. Intel is projected to see 2023 revenue of $53.9 billion, while Nvidia’s projected revenue for calendar 2023 is $56.2 billion, according to FactSet.

Even more spectacular are the projections for Nvidia’s calendar 2024: Analysts forecast revenue of $89.2 billion, a surge of 59% from 2023, and about three times higher than 2022. In contrast, Intel’s 2024 revenue is forecast to grow 13.3% to $61.1 billion. (Nvidia’s fiscal year ends at the end of January. FactSet’s data includes pro-forma estimates for calendar years.)
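As a quick arithmetic aside, the growth rate quoted above can be checked directly from the FactSet revenue estimates cited in this article (figures in billions of U.S. dollars):

```python
# Sanity-check the Nvidia growth rate implied by the FactSet estimates
# quoted above (all figures in billions of U.S. dollars).
nvidia_2023 = 56.2
nvidia_2024 = 89.2

growth_pct = (nvidia_2024 / nvidia_2023 - 1) * 100
print(round(growth_pct))  # 59, matching the "surge of 59%" cited above
```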

“It has coalesced into primarily an Nvidia-controlled market,” said Karl Freund, principal analyst at Cambrian AI Research. “Because Nvidia is capturing market share that didn’t even exist two years ago, before ChatGPT and large language models. … They doubled their share of the data-center market. In 40 years, I have never seen such a dynamic in the marketplace.”

Nvidia has become the king of a sector that is adjacent to the core-processor arena dominated by Intel. Nvidia’s graphics chips, used to accelerate AI applications, reignited the data-center market with a new dynamic for Wall Street to watch.

Intel has long dominated the overall server market with its Xeon central processing unit (CPU) family, which is the heart of computer servers, just as CPUs are the brain chips of personal computers. Five years ago, Advanced Micro Devices Inc., Intel’s rival in PC chips, re-entered the lucrative server market after a multi-year absence, and AMD has since carved out a 23% share of the server market, according to Mercury Research, though Intel still dominates with a 76.7% share.

Graphics chips in the data center

Nowadays, however, the data-center story is all about graphics processing units (GPUs), and Nvidia’s have become favored for AI applications. GPU sales are growing at a far faster pace than the core server CPU chips.

Also read: Nvidia’s stock dubbed top pick for 2024 after monster 2023, ‘no need to overthink this.’

Nvidia was basically the entire data-center market in the third quarter, selling about $11.1 billion in chips, accompanying cards and other related hardware, according to Mercury Research, which has tracked the GPU market since 2019. The company had a stunning 99.7% share of GPU systems in the data center, excluding any devices for networking, according to Dean McCarron, Mercury’s president. The remaining 0.3% was split between Intel and AMD.

Put another way: “It’s Nvidia and everyone else,” said Stacy Rasgon, a Bernstein Research analyst.

Intel is fighting back now, seeking to reinvigorate growth in data centers and PCs, which have both been in decline after a huge boom in spending on information technology and PCs during the pandemic. This month, Intel unveiled new families of chips for both servers and PCs, designed to accelerate AI locally on the devices themselves, which could also take some of the AI compute load out of the data center.

“We are driving it into every aspect of the applications, but also every device, in the data center, the cloud, the edge of the PC as well,” Intel CEO Pat Gelsinger said at the company’s New York event earlier this month.

While AI and high-performance chips are coming together to create the next generation of computing, Gelsinger said it’s also important to consider the power consumption of these technologies. “When we think about this, we also have to do it in a sustainable way. Are we going to dedicate a third, a half of all the Earth’s energy to these computing technologies? No, they must be sustainable.”

Meanwhile, AMD is going directly after both the hot GPU market and the PC market. It, too, had a big product launch this month, unveiling a new family of GPUs that was well-received on Wall Street, along with new processors for the data center and PCs. It forecast it will sell at least $2 billion in AI GPUs in their first year on the market, a big challenge to Nvidia.

Also see: AMD’s new products represent first real threat to Nvidia’s AI dominance.

That forecast “is fine for AMD,” according to Rasgon, but it would amount to “a rounding error for Nvidia.”

“If Nvidia does $50 billion, it will be disappointing,” he added.

But AMD CEO Lisa Su might have taken a conservative approach with her forecast for the new MI300X chip family, according to Daniel Newman, principal analyst and founding partner at Futurum Research.

“That is probably a fraction of what she has seen out there,” he said. “She is starting to see a robust market for GPUs that are not Nvidia. … We need competition, we need supply.” He noted that it is early days and the window is still open for new developments in building AI ecosystems.

Cambrian’s Freund noted that it took AMD about four to five years to gain 20% of the data-center CPU market, making Nvidia’s stunning growth in GPUs for the data center even more remarkable.

“AI, and in particularly data-center GPU-based AI, has resulted in the largest and most rapid changes in the history of the GPU market,” said McCarron of Mercury, in an email. “[AI] is clearly impacting conventional server CPUs as well, though the long-term impacts on CPUs still remain to be seen, given how new the recent increase in AI activity is.”

The ARMs race

Another development that will further shape the computing-hardware landscape is the rise of a competitor to the x86 architecture based on reduced instruction set computing (RISC). In the past, RISC designs have mostly made inroads in mobile phones, tablets and embedded systems dedicated to a single task, through the chip designs of ARM Holdings Plc and Qualcomm Inc.

Nvidia tried to buy ARM for $40 billion last year, but the deal did not win regulatory approval. Instead, ARM went public earlier this year, and it has been promoting its architecture as a low-power-consuming option for AI applications. Nvidia has worked for years with ARM. Its ARM-based CPU called Grace, which is paired with its Hopper GPU in the “Grace-Hopper” AI accelerator, is used in high-performance servers and supercomputers. But these chips are still often paired with x86 CPUs from Intel or AMD in systems, noted Kevin Krewell, an analyst at Tirias Research.

“The ARM architecture has power-efficiency advantages over x86 due to a more modern instruction set, simpler CPU core designs and less legacy overhead,” Krewell said in an email. “The x86 processors can close the gap between ARM in power and core counts. That said, there’s no limit to running applications on the ARM architecture other than x86 legacy software.”

Until recently, ARM RISC-based systems have had only a fractional share of the server market. But now an open-source RISC instruction set, albeit about 10 years old, called RISC-V is capturing the attention of big internet and social-media companies as well as startups. Power consumption has become a major issue in data centers, and AI accelerators use incredible amounts of energy, so companies are looking for alternatives to save on power usage.

Estimates for ARM’s share of the data center vary slightly, ranging from about 8%, according to Mercury Research, to about 10% according to IDC. ARM’s growing presence “is not necessarily trivial anymore,” Rasgon said.

“ARM CPUs are gaining share rapidly, but most of these are in-house CPUs (e.g. Amazon’s Graviton) rather than products sold on the open market,” McCarron said. Amazon’s Graviton processor family, first offered in 2018, is optimized to run cloud workloads at Amazon Web Services. Alphabet Inc. also is developing its own custom ARM-based CPUs, codenamed Maple and Cypress, for use in its Google Cloud business, according to a report earlier this year by The Information.

“Google has an ARM CPU, Microsoft has an ARM CPU, everyone has an ARM CPU,” said Freund. “In three years, I think everyone will also have a RISC-V CPU. … It is much more flexible than ARM.”

In addition, some AI chip and system startups are designing around RISC-V, such as Tenstorrent Inc., a startup co-founded by well-regarded chip designer Jim Keller, who has also worked at AMD, Apple Inc., Tesla Inc. and Intel.

See: These chip startups hope to challenge Nvidia but it will take some time.

Opportunity for the AI PC

Like Intel, Qualcomm has also launched an entire product line around the personal computer, a brand-new endeavor for the company best known for its mobile processors. It cited the opportunity and need to bring AI processing to local devices, or the so-called edge.

In October, it said it is entering the PC business, dominated by Intel’s x86 architecture, with its ARM-based Snapdragon X Elite platform. It has designed the new processors specifically for the PC market, where it said their lower power consumption and far faster processing will be a huge hit with business users and consumers, especially those running AI applications.

“We have had a legacy of coming in from a point where power is super important,” said Kedar Kondap, Qualcomm’s senior vice president and general manager of compute and gaming, in a recent interview. “We feel like we can leverage that legacy and bring it into PCs. PCs haven’t seen innovation for a while.”

Software could be an issue, but Qualcomm has partnered with Microsoft on emulation software, and it trotted out many PC vendors, with Snapdragon-based PCs slated to be ready to tackle computing and AI challenges in the second half of 2024.

“When you run stuff on a device, it is secure, faster, cheaper, because every search today is faster. Where the future of AI is headed, it will be on the device,” Kondap said. Indeed, at its chip launch earlier this month, Intel quoted Boston Consulting Group, which forecast that by 2028, AI-capable PCs will comprise 80% of the PC market.

All these different changes in products will bring new challenges to leaders like Nvidia and Intel in their respective arenas. Investors are also slightly nervous about Nvidia’s ability to keep up its current growth pace, but last quarter Nvidia talked about new and expanding markets, including countries and governments with complex regulatory requirements.

“It’s a fun market,” Freund said.

And investors should be prepared for more technology shifts in the year ahead, with more competition and new entrants poised to take some share — even if it starts out small — away from the leaders.


AMD Radeon RX 7600 GPU Review: Presenting the new king of 1080p gaming – Technology News, Firstpost

Pros:
– Awesome performance at 1080p ultra settings
– Pretty decent performance at 1440p ultra settings
– Ports, especially DisplayPort 2.1
– Compact design, meaning it can fit into smaller cases
– AV1 support
– Price to performance. A true value-for-money GPU

Cons:
– 8GB VRAM and 128-bit bus are adequate for now but will struggle in a few years
– Faces tough competition from previous-generation cards
– Lacks bling. No RGB

Rating: 4.25/5
Pricing: Rs 26,500

In the last couple of years, AMD has made a real case for any casual gamer to take its GPUs seriously as a properly viable option. This is mainly down to three factors. First, NVIDIA, the company that would like you to believe Moore’s Law is dead (long live Moore’s Law!), has been all over the place with the pricing, as well as the performance, of its RTX 3000- and RTX 4000-series GPUs. The 4000 series, in particular, has been very questionable.

Image Credit: Tech2 | Mehul Reuben Das

Second, Intel, despite designing a very good GPU, has been haunted by its drivers since launch day. Yes, over the months its GPUs have become a force to be reckoned with, but the impression its cards gave at launch is hard to shake off.

And finally, and most importantly, AMD’s CPUs have performed very solidly for the last two to three generations, especially when you consider the price-to-performance ratio. Sure, they may not have topped every benchmark or gaming FPS chart, but at the prices they came in at, the difference was usually a few percentage points, something that’s barely noticeable in real-life applications.

The headlines that the RX 6600 made last year were largely positive because it was a damn good card, especially for its price back then. Today, the situation is different. GPUs are easily and readily available, and no matter how much they try, NVIDIA can’t gouge its customers as easily as it did last year.

AMD is positioning the RX 7600 as a 1080p gaming GPU, mainly because, as per Steam’s hardware survey, 65 per cent of gamers play at that resolution, with most of them using either a GeForce GTX 1060 6GB or an RTX 2060 6GB. For such gamers, the RX 7600, with its RDNA 3 architecture, is supposed to be the perfect GPU to upgrade to. In such a scenario, is the RX 7600 worth it?

AMD Radeon RX 7600 GPU Review (2)
Image Credit: Tech2 | Mehul Reuben Das

AMD Radeon RX 7600 GPU Review: Specifications and Features
The Radeon RX 7600 is built on AMD’s RDNA 3 architecture and is produced using TSMC’s 6nm manufacturing process. It features 128 Texture Mapping Units, 64 Raster Operation Processors, and 32 Ray Tracing units. With 2,048 Streaming Processors, it operates at a base clock speed of 2,250MHz and can boost up to 2,625MHz. The GPU is equipped with 8GB of GDDR6 memory and a 128-bit memory interface, providing a maximum bandwidth of up to 476.9 GB/s thanks to AMD Infinity Cache.
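A quick back-of-the-envelope calculation shows why the Infinity Cache matters here: the raw bandwidth of a 128-bit GDDR6 bus is far lower than the 476.9 GB/s effective figure quoted above, and the cache makes up the gap by acting as a bandwidth amplifier. This sketch assumes the commonly quoted 18 Gbps per-pin memory speed for the RX 7600 (an assumption, not a figure from this review; check AMD’s spec sheet):

```python
# Raw memory bandwidth of a GDDR6 bus: (bus width in bytes) x (per-pin rate).
# 18 Gbps per pin is the commonly quoted RX 7600 memory speed; treat it
# as an assumption here.
bus_width_bits = 128
per_pin_gbps = 18  # effective transfer rate per pin, in Gbit/s

raw_gb_per_s = bus_width_bits / 8 * per_pin_gbps
print(raw_gb_per_s)  # 288.0 GB/s raw; the 476.9 GB/s "effective" figure
                     # includes the Infinity Cache's contribution
```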

To connect to your PC, it utilizes a PCIe Gen 4.0 x8 slot. According to AMD, the GPU has a power draw of 165W TBP (total board power) and requires a single 8-pin connector. A 550W power supply is sufficient to fully utilize the GPU’s capabilities, so there is no need to upgrade your power supply unit.

AMD Radeon RX 7600 GPU Review (3)
Image Credit: Tech2 | Mehul Reuben Das

Design-wise, the Radeon RX 7600 is a simplistic card. It has no RGB lighting of any sort, but the reference card that we tested had a rather cool-looking backplate. There are two 78mm fans, each with nine blades and an integrated rim. The card itself is about 8 inches long and occupies two slots in a typical chassis. What this means is that it isn’t as thick as some of the latest offerings from NVIDIA, and can actually be used in compact builds as well.

While the 128-bit bus may seem narrow, AMD is able to not only remedy this but also improve efficiency and performance by using its 2nd Generation AMD Infinity Cache. This cache brings a new approach to data delivery in GPUs: the cache hierarchy has been carefully optimized to strike the right balance between Infinity Cache and L2 cache, ensuring optimal performance and efficiency.

The 2nd Generation AMD Infinity Cache serves as a global cache, providing fast access to data and acting as a powerful bandwidth amplifier. It enhances the GPU’s performance by delivering high-performance bandwidth while maintaining superb power efficiency. This innovative cache design significantly contributes to the overall efficiency and capabilities of the RDNA 3 architecture.

The Radeon RX 7600 has the usual three DisplayPort 2.1 UHBR13.5 ports and a single HDMI 2.1 port. This is one area where AMD’s latest generation hardware exceeds Nvidia’s Ada design, which still uses DisplayPort 1.4a.

AMD Radeon RX 7600 GPU Review (4)
Image Credit: Tech2 | Mehul Reuben Das

AMD has designed the Radeon RX 7600 with a particular focus on streamers. That is why you get hardware-accelerated support for media codecs like AV1, HEVC, H.264 and VP9.

The RX 7600 also has a bunch of features that make gameplay very smooth, making the GPU very appealing. For example, there is FSR 2.0, or AMD FidelityFX Super Resolution 2, the latest iteration of AMD’s open-source upscaling technology (FSR 2.0 uses temporal upscaling, where FSR 1 was purely spatial). It is designed to enhance frame rates and provide gamers with exceptional high-quality, high-resolution gaming experiences. Think of FSR 2.0 as a magic sauce that AMD applies to your games to make them run buttery smooth, crisp and tack sharp.

In addition, AMD’s Radeon Super Resolution (RSR) utilizes the power of AMD FidelityFX Super Resolution (FSR) 1 spatial upscaling within the driver. This feature allows for enhanced performance in thousands of games, completely free of charge. RSR can be enabled globally or on a per-game basis, giving users flexibility in their settings.

When enabled, a slider becomes available, offering further customization options to adjust the sharpness effect of Radeon Super Resolution. A display box will indicate whether RSR is active or inactive, along with the resolution that has been upscaled. To provide an example, if you have a 1440p monitor and you lower your game’s resolution to 1920×1080, RSR will upscale the 1080p resolution to match 1440p.
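The render-resolution savings behind that example are easy to quantify. As a rough sketch, when rendering at 1080p for a 1440p display, the GPU shades only a little over half the pixels it would at native resolution:

```python
# Pixel-count comparison for the RSR example above: rendering at 1920x1080
# on a 2560x1440 display, then upscaling to fill the screen.
native = 2560 * 1440   # 3,686,400 pixels per frame at native 1440p
render = 1920 * 1080   # 2,073,600 pixels actually shaded at 1080p

fraction_shaded = render / native
print(f"{fraction_shaded:.2%}")  # 56.25% of the native pixel count
```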

We have previously discussed the RX 7600’s SAM (Smart Access Memory) feature. SAM, AMD’s branding for Resizable BAR, lets the CPU address the graphics card’s entire VRAM directly, rather than through a small 256MB window. In simple terms, Smart Access Memory is an optimization technique that enhances the processor’s access to a graphics card’s VRAM.

This enables the CPU to efficiently transfer a large amount of data to the GPU and offload its graphics calculations. As a result, the GPU can generate frames at a much faster rate, significantly reducing the time required for frame generation. In essence, SAM improves the collaboration between the CPU and GPU, resulting in faster graphics rendering and a smoother overall gaming experience. Do note that SAM or ReBAR is available on recent CPUs only. For Intel, you need 11th Gen CPUs or later, and for AMD, Ryzen 5000 Series CPUs and newer, along with 400 Series motherboards or newer.

AMD Radeon RX 7600 GPU Review (5)
Image Credit: Tech2 | Mehul Reuben Das

Additionally, AMD SmartAccess Video intelligently distributes decoding and encoding workloads across all available video engines. By utilizing the video compression engines on both the Ryzen processor and Radeon graphics, it optimizes the distribution of video-related tasks. This results in fewer dropped frames and an overall faster experience when it comes to video editing and transcoding.

The RX 7600 comes with hardware-level support for AV1 encoding and decoding, making it a compelling choice for streamers. AV1 offers significantly better image quality compared to H.264, while requiring only a fraction of the bitrate. As a result, video game streaming with AV1 encoding places much less strain on computing resources compared to the traditional H.264 standard. This feature allows streamers to deliver high-quality streams with reduced computational overhead, enhancing the streaming experience for both content creators and viewers.

AMD Radeon RX 7600 GPU Review: Our testing rig
We paired our test unit of the Radeon RX 7600 with an AMD Ryzen 9 7900X and 32GB of Kingston Fury (2x16GB) RAM rated at 6000 MT/s, all connected to a Gigabyte Aorus X670E Master motherboard. Cooling the CPU was an AMD Wraith Prism cooler, and powering everything was a Cooler Master MWE 750W V2 80 Plus Bronze PSU.

AMD Radeon RX 7600 GPU Review (6)
Image Credit: Tech2 | Mehul Reuben Das

As always, we did not overclock or change any settings before running our benchmarks and testing out games. The only changes we made were to enable EXPO, so that our RAM operated at its rated speed, and to ensure that ReBAR (or, as AMD calls it, Smart Access Memory, or SAM) was enabled, which AMD does by default if you have a compatible motherboard and GPU. Other than this, we ran everything at stock.
The reason we checked whether SAM was enabled is simple: although not all games benefit greatly from ReBAR or SAM, the ones that do show 18-20 per cent higher frame rates compared to systems that don’t have the option enabled.
We ran the games at the highest possible settings whenever possible, with FSR 2.0 enabled in all the games that supported it. We also tested the games at 1440p.

AMD Radeon RX 7600 GPU Review: Performance
The Radeon RX 7600 is heading straight for the RTX 3060, and makes no qualms about it. The RX 7600 simply crushes the best RTX 3060s in practically every scenario, and that too while coming in at a considerably lower price. In fact, in some scenarios, it performed almost as well as the RTX 3060 Ti and even the RTX 3070.

AMD Radeon RX 7600 GPU Review (6)

The 3DMark scores for the Radeon RX 7600 align with our expectations. It performs comparably to the RTX 3060, showing similar performance levels. However, in Furmark, the RX 7600 outperforms the competition by a significant margin. In terms of compute performance, LuxMark results are not in favour of the RX 7600, as it falls behind other cards. Nevertheless, its performance in Superposition remains highly competitive when compared to the aforementioned cards. Overall, the RX 7600 demonstrates strong performance in various benchmark tests, showcasing its capabilities in different scenarios.


Indeed, the Radeon RX 7600 is well-suited for 1080p gaming, making it a reliable choice for that resolution. However, it doesn’t limit you to just 1080p gaming. Less demanding games will perform excellently at 1440p, and in certain cases, even 4K resolution is achievable. On the other hand, there may be extremely demanding games that struggle to run smoothly at 1080p with maximum settings. Nevertheless, in general, mainstream GPUs like the RX 7600 are typically considered ideal for 1080p gaming, offering a balance between performance and affordability.


What worries us, however, is the 8GB of VRAM and the 128-bit bus. Sure, it is good enough for most games today. And in certain games where the 8GB of VRAM fell short, like The Last of Us, bumping the settings down from Ultra surely does help.


But consider this: game developers are spending less and less time optimising their games for budget hardware. Moreover, in 2024 and 2025, we have games like GTA VI launching. We can only imagine what sort of texture packs those are going to bring by the time they come to PC. While 8GB of VRAM and a 128-bit bus may be adequate for 1080p ultra gaming now, they won’t be nearly as adequate in a couple of years, when games become more demanding and people start moving towards 1440p and possibly even 4K.

AMD Radeon RX 7600 GPU Review: Conclusion
When considering whether to purchase the Radeon RX 7600 or explore options from NVIDIA or Intel, several factors come into play. Taking into account the price difference between the RX 7600 and even the basic RTX 3060 GPUs, we wholeheartedly recommend the RX 7600 for those building a new system specifically for 1080p gaming. Even at 1440p, the RX 7600 remains a sensible choice, especially when compared with Intel’s Arc A750.

However, if you’re planning to upgrade a GPU that is only a couple of years old or if your system is equipped with an older CPU (such as Intel’s 8th Generation or earlier, or an older AMD CPU), it may be more advantageous to explore other options.

AMD Radeon RX 7600 GPU Review (7)
Image Credit: Tech2 | Mehul Reuben Das

In terms of gaming performance, the RX 7600 performs almost as well as the RTX 3060 in most games and even outperforms it in some cases. If ray tracing is of utmost importance, even more so than higher frame rates, then opting for an RTX option may be preferable, albeit at a higher cost.

For 1080p gaming, the Radeon RX 7600 emerges as a strong contender and arguably the best option currently available. While the card may have some limitations, it surpasses many of its competitors and offers a more appealing price point.

While Nvidia’s offerings are a viable alternative, they come at a significantly higher cost, and the performance increase in most tests and games does not justify the extra expense. Therefore, if you prioritize a balance between performance and affordability for 1080p gaming, the Radeon RX 7600 is a compelling choice that delivers solid results without breaking the bank.

The RX 7600 boasts certain advantages over its NVIDIA counterparts, making it the preferred choice in many scenarios. For streamers and content creators, the RX 7600 appears to be an ideal GPU option, and similarly, for those building a budget-oriented editing and/or gaming PC, it remains a strong contender. Additionally, the support for AV1 video, although not immediately crucial, is a format that streamers and content creators who understand its benefits will likely transition to in the future.


The Future of GA Treatment

There’s new hope for people living with geographic atrophy (GA), an advanced form of the eye disease dry age-related macular degeneration (dry AMD). Scientists hope they’re close to new therapies for the condition that’s proven hard to treat in the past.

In dry AMD, small yellow lesions called drusen form under your eye’s retina. If they grow larger, drusen can block nutrients from reaching the retina and cause cell death. Your eyesight becomes blurred, and if AMD advances to GA, you may have trouble seeing from the center part of your vision.

There are two forms of AMD, wet and dry. Dry AMD affects around 90% of all people with AMD and usually gets worse more slowly. Although treatments for wet AMD have evolved quickly in the past few years, innovations in the dry form of the condition have come at a slower pace. 

Michael Cooper, OD, an optometrist and director of medical education at Eyes on Eyecare, calls GA “a currently irreversible, visually devastating disease for millions of people.” 

“We want to help people with GA take back some control and empower them by identifying GA earlier on, so they can live their life the way they want,” he says. And while vision loss from GA is permanent, future treatments may stop or slow the disease from getting worse over time. 

Right now, the only treatments that might reduce the progression of dry AMD are vitamins and supplements. And once the illness advances to GA, there are no therapies – vision loss in these areas is permanent. Recently though, researchers have made exciting breakthroughs in pursuing treatments for GA, including medicines and surgery. 

What Is the Role of the Complement System in GA?

Many of the emerging treatments for GA work to control a part of your immune system called the complement system. These two systems team up to protect you from things that can make you sick, such as viruses and bacteria. Your complement system enhances your immune system by switching on proteins that help keep you healthy.

About 50 tiny proteins in your blood’s plasma make up your complement system. Normally, these proteins are idle until something triggers them, like when you’re injured or fighting off bacteria. This sets off a protective chain reaction called a cascade, where one protein switches on, followed by another and another.   

Sometimes, proteins in your complement system work too hard, and your body triggers them too often. When this happens, it raises your chances of disease, including AMD, which can lead to GA.

What Are Some Promising Treatments for GA?

The most promising treatments for GA target the complement system. Cooper says researchers haven’t had a strong grasp of the science behind GA, but recently, the complement system “has become the marquee area of geographic atrophy research.”

Researchers have homed in on two types of protein in your blood, the C3 and C5 proteins. Usually, these proteins get rid of germs that make you ill, but they can cause inflammation and also attack healthy cells.

Researchers think C3 and C5 play a critical role in whether you’ll get AMD and eventually GA. They’ve been studying treatments that work to keep the complement system in check and slow the growth of GA lesions. While early clinical trials weren’t successful, recent studies have shown more potential.

Complement Inhibitors

One possible treatment is an eye injection called a complement inhibitor. It works by slowing C3 and the growth of GA lesions in people with dry AMD. A study of the therapy, named pegcetacoplan (Syfovre), found it can help slow lesion growth in those who have monthly shots and those who get shots every other month. 

Based on the results of three studies, the FDA has fast-tracked the drug. The fast-track process speeds up the development and review of important new treatments so they can get to people sooner. The FDA considers whether the drug will fill an “unmet medical need,” meaning there’s currently no treatment for a specific medical condition, like geographic atrophy.

Another complement inhibitor, called avacincaptad pegol (Zimura), slows GA from getting worse by targeting the C5 protein. One study found that people who took the drug, given as an eye injection, slowed GA by around 27% over 12 months. 

In late 2022, the FDA named the treatment a breakthrough therapy. Like fast track, this process also speeds the development and review of certain drugs. A breakthrough therapy aims to treat a serious condition, and early evidence may show that the drug has an advantage over an available treatment. 

Besides shots, researchers are also studying complement inhibitors in tablet form. These clinical trials are not as far along as the ones for treatments you take as a shot.

Gene Therapy Surgery

One possible downside of eye injections is that you may need them once a month or every 2 months for life. But researchers are looking at another option for GA that you would need just one time. 

It’s a gene therapy designed to help the eye make a protein called complement factor I (CFI). CFI keeps complement in check, and boosting it with a one-time shot delivered beneath the retina can balance out an overactive complement system. 

Cooper says gene therapy is the next wave of treatments for GA. “As time progresses, we get more sophisticated with our ability to formulate these medications, and I think we’ll see more of this type of delivery.” 

Early study data found most people who had the treatment showed higher CFI levels. Some saw these results more than a year post-treatment. Researchers continue to study gene therapy for GA in ongoing clinical trials. 

Modified Vitamin A

Vitamin A is essential for vision but can turn toxic and form what scientists call “dimers.” Researchers have long thought that dimers play a role in whether you’ll get dry AMD. Now, they’re studying a chemically modified form of vitamin A that could ward off and treat dry AMD. 

The drug, a capsule called ALK-001, replaces your body’s natural vitamin A with a version that slows the dimer-making process. Scientists are currently investigating how well the drug works to slow GA.   
