
Samsung Unveils New Type of High-Bandwidth, Low-Power DRAM

By Skypeak Limits · 11 January 2024 · 3 Mins Read
Image: Samsung LPDDR5X DRAM 1B

It is claimed that the memory provides 128GB/s of bandwidth while drawing very little power.

The Consumer Electronics Show (CES) is underway, as you may have heard, which means every technology company is announcing how its products benefit the artificial intelligence industry, or how they employ AI. That is the case for Samsung, which has introduced a new type of high-bandwidth, low-latency PC memory that could potentially compete with DDR5 on speed, as the comparison sketched below suggests. This certainly sounds promising; however, Samsung says it was created with one thing in mind, artificial intelligence, so it is not obvious whether it will replace DDR modules or serve as a low-power option for new applications.
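For rough context on that DDR5 comparison, here is a back-of-the-envelope sketch using ordinary transfer-rate-times-bus-width arithmetic; these are standard DDR5 numbers, not figures from Samsung's announcement.

```python
# Back-of-the-envelope DDR5 bandwidth arithmetic, for context only.
# Standard transfer-rate x bus-width math, not figures from Samsung's announcement.

def ddr5_peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int = 64) -> float:
    """Theoretical peak bandwidth of one DDR5 module/channel, in GB/s."""
    bytes_per_transfer = bus_width_bits / 8      # a 64-bit channel moves 8 bytes per transfer
    return transfer_rate_mts * 1e6 * bytes_per_transfer / 1e9

one_module = ddr5_peak_bandwidth_gbs(8000)       # DDR5-8000, single module
print(f"DDR5-8000, one module:   {one_module:.0f} GB/s")       # ~64 GB/s
print(f"DDR5-8000, dual channel: {2 * one_module:.0f} GB/s")    # ~128 GB/s
```

Read that way, the 128GB/s quoted for the new memory sits around dual-channel DDR5-8000 territory, which is presumably the sense in which the comparison is made.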

In a press statement labeled as “editorial” and sent out this week, Samsung discussed how its memory products will “harness the AI era.” The editorial makes a number of broad, unsubstantiated assertions about artificial intelligence, such as: “At home, it is making our lives easier and more enjoyable.” The interesting part, however, describes the memory products Samsung is working on to let artificial intelligence models run on devices. Because of their size and the amount of compute they demand, these models have so far had to run in the cloud, which presents a significant problem for the industry. Among the items listed as enabling an artificial intelligence model to run locally rather than in the cloud is a type of memory we have not heard of before: Low Latency Wide I/O (LLW) DRAM, manufactured by Samsung.

The editorial does not disclose a great deal about this enigmatic new type of memory. According to Tom’s Hardware, it only reports that the memory provides up to 128GB/s of bandwidth, comparable to a DDR5-8000 module. It achieves that throughput at just 1.2pJ/b, a relatively low figure that suggests it may be aimed at mobile devices such as smartphones and laptops rather than desktop home computers. There is also no indication of the speeds at which these modules run to achieve this combination of high bandwidth and low power consumption. We will have to wait for additional information on which products it may be used in, and whether it will be a purely enterprise offering or also available for client applications.
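To put the 1.2pJ/b figure in perspective, here is a quick estimate of what the interface would draw while streaming at the quoted 128GB/s. This is my own arithmetic from the two numbers above, not a power figure Samsung has published.

```python
# Rough power estimate for LLW DRAM at its quoted peak bandwidth.
# Derived from the article's two numbers; not a figure published by Samsung.

bandwidth_gb_per_s = 128       # quoted peak bandwidth, gigabytes per second
energy_pj_per_bit = 1.2        # quoted energy cost, picojoules per bit

bits_per_second = bandwidth_gb_per_s * 1e9 * 8               # bytes -> bits
power_watts = bits_per_second * energy_pj_per_bit * 1e-12    # pJ -> J

print(f"~{power_watts:.2f} W to stream {bandwidth_gb_per_s} GB/s")   # roughly 1.2 W
```

A memory link that can sustain DDR5-class bandwidth for on the order of a watt is the kind of budget a phone or thin laptop can afford, which is why the mobile reading of the announcement seems plausible.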

As AI models move off the cloud and onto our devices, the rise of artificial intelligence will bring real challenges for companies such as Samsung. AMD, Intel, and Nvidia have begun talking up dedicated hardware such as Neural Processing Units (NPUs) and tensor cores to highlight the advantages of running artificial intelligence applications on-device, but for now there is a shortage of both processing power and applications that can actually make use of them. LLW DRAM could be one of a family of emerging technologies that help move AI models from the cloud to the device. As is typically the case with things of this nature, though, we will have to wait and see whether the technology has legs or whether it becomes the new metaverse by the time the year is out.

