Microsoft mulling 128-bit versions of Win8, Win9


  1. Posts : 3,639
    Windows 7 Ultimate, OS X 10.7, Ubuntu 11.04
       #60

    sideeffect said:
    The need for 64-bit processors arose from the increasing RAM usage of modern home computers. We currently have no such problem; the limitation is the size of current memory modules or the number of RAM slots.
    I think we should encourage staying ahead of things, rather than lagging behind (take Adobe for example, and 64-bit Flash for Windows :)).


  2. Posts : 1,557
    XP, Seven, 2008R2
       #61

    99% of Windows users still run 32-bit Windows because of developers lagging behind. Developers are lagging behind because 99% of Windows users still run 32-bit Windows.

    There's a term for that, but I'm having a mental block right now..


  3. Posts : 51,485
    Windows 11 Workstation x64
       #62

    DarkNovaGamer said:
    Dear god we aren't even fully switched to x64.

    Speak for yourself.....I'm 5 years in.


  4. Posts : 1,557
    XP, Seven, 2008R2
       #63

    Congrats 010

    Now it's time for developers to get on board. I've still got perfectly functioning hardware that doesn't have 64-bit drivers (no, I'm not up for buying new printers, TV capture cards, and so on just for 64-bit support).


  5. Posts : 333
    Linux (Debian, Android)
       #64

    sup3rsprt said:
    99% of Windows users still run 32-bit Windows because of developers lagging behind. Developers are lagging behind because 99% of Windows users still run 32-bit Windows.

    There's a term for that, but I'm having a mental block right now..
    I believe that is called a Catch-22.


  6. Posts : 8,398
    ultimate 64 sp1
       #65

    you sure it isn't catch-32?


  7. Posts : 1,112
    XP_Pro, W7_7201, W7RC.vhd, SciLinux5.3, Fedora12, Fedora9_2x, OpenSolaris_09-06
       #66

    H2SO4 said:
    Antman said:
    Lee said:
    Let me ask you all a question: how many of you are going to turn down the chance to be the first owner of a 128-bit system, and to be first on the beta list to test the new OS for it?

    I know my name is going to be at the top (if I live that long).:)
    FTW
    I'd like to offer a counterpoint, but only because I'm like that, not because I'm not excited by the thought of bigger and better computers. (I totally agree with Lee, and with Antman's "FTW".)
    I also agree with that. Not that I ever anticipate actually owning a 128-bit microprocessor...

    As Clive Sinclair reportedly retorted when somebody asked him why he settled for the 8-bit Z80 processor in his ZX Spectrum (at a time when 16-bit architectures were becoming available), "because I couldn't find a 4-bit chip I really liked."

    The outwardly facetious comment is in fact insightful.

    Previously, the transition to greater "bittiness" was always forced upon us by simple mathematical exhaustion of the describable memory address range. The 16-bit [segment: offset] addressing scheme was designed to defeat the overly limiting 64KB flat address space. Even so, 1MB became insufficient within a few years.
    I believe someone did say, "640K ought to be enough for anybody", or words to that effect.
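
    (For anyone who never wrestled with real-mode pointers, here's a rough sketch of how the [segment: offset] scheme stretches two 16-bit values into a 20-bit, 1 MB physical address range; my own illustration, not part of the quoted post.)

    Code:
    /* 8086 real mode: physical address = (segment << 4) + offset.
       Two 16-bit values give a 20-bit result, hence the old 1 MB ceiling. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint16_t segment = 0xF000, offset = 0xFFFF;
        uint32_t physical = ((uint32_t)segment << 4) + offset;

        printf("%04X:%04X -> 0x%05X (top of the 1 MB range)\n", segment, offset, physical);
        return 0;
    }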

    When the first 80386 PCs came out in the mid-80s, the 32-bit machine was capable of a flat 4GB address space. That was adequate for around two decades. By the mid-90s, large databases were pushing the envelope, so as a stopgap Intel brought out PAE, extending physical addressing to 36 bits for a grand total of 64GB. It wasn't uncommon for massive server systems to nudge against that limit by 2000 or a few years later.
    I don't believe you're talking about RAM here, but rather disk storage space, back then.
    64 GB of physical RAM per CPU is still a very large number.
    The Roadrunner petaflop machine uses, per node, only
    16 GB of memory for the IBM Cell/BE and
    16 GB of memory for the AMD Opteron,
    and both are 64-bit microprocessors.
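
    For the record, the raw arithmetic behind those ceilings (a quick sketch of mine, nothing from the quoted post):

    Code:
    /* Address-space sizes for the widths discussed above. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        printf("32-bit flat : %.0f GB\n", pow(2, 32) / pow(2, 30)); /*  4 GB */
        printf("36-bit PAE  : %.0f GB\n", pow(2, 36) / pow(2, 30)); /* 64 GB */
        printf("64-bit flat : %.0f EB\n", pow(2, 64) / pow(2, 60)); /* 16 EB */
        return 0;
    }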

    The 64-bit address space is 16 exabytes in size. However, Windows currently artificially limits itself to a mere fraction of that (though that will change in future versions). This time around, we are not even close to exhausting the address space yet, so the talk of double the bittiness is being driven by other factors.

    The number of available general purpose registers, the way they are used, the efficiency of the calling convention... Currently, the potential of a new architecture is more exciting because of the chance to improve in those areas than because of the meaninglessly vast 128-bit address space. Even the 64-bit one we've got now is still almost entirely empty.
    Indeed, but "128-bit" doesn't just describe address space; it also describes data width, and that's where the 128 matters.
    The data width influences the instruction size and the precision of the data being processed.
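
    (Strictly on the data-width side, desktop x86 already gives a small taste of 128 bits: the SSE/XMM registers are 128 bits wide even though addressing stays at 64 bits. A throwaway sketch of mine to illustrate the data-width versus address-width distinction:)

    Code:
    /* One 128-bit SSE2 register holds four 32-bit integers; a single
       instruction (PADDD) adds all four lanes at once. */
    #include <emmintrin.h>   /* SSE2 intrinsics */
    #include <stdio.h>

    int main(void)
    {
        __m128i a   = _mm_set_epi32(4, 3, 2, 1);
        __m128i b   = _mm_set_epi32(40, 30, 20, 10);
        __m128i sum = _mm_add_epi32(a, b);       /* 128 bits of data per op */

        int out[4];
        _mm_storeu_si128((__m128i *)out, sum);
        printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);  /* 11 22 33 44 */
        return 0;
    }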
    AFAIK, the only "other" processor that deals with 128 bits is the IBM PowerXCell 8i,
    but its 8 SPE units probably fetch in 64-bit chunks...
    To me, it looks like one amazing architecture, which I've been looking into for a few days now...
    One might find this interesting reading:

    http://www-03.ibm.com/systems/resour...WhitePaper.pdf

    [Pic of the PowerXCell: shows a single core only; the chip itself is dual-core.]

    Architecturally, it appears (to me) to be similar to having both a CDC 7600 and a Cray XMP 4 on a single microchip,
    with 8 'functional units' (SPEs) in each of the dual cores:

    1. 7600, 9 functional-units: CDC 7600s at Livermore
    2. Cray, a vector-machine: Cray XMP

    except that each of the 8 SPE 'functional units' is not hardware-bound but programmable in software,
    and can thus be anything you want it to be...

    This appears to be IBM's "updated" (double-precision floating point) version of Antman's earlier posting:

    Antman said:
    It is not really news. Perhaps the bit about the soon to be fired blogger is news, but the tech development is nothing new.

    10 years ago - (Sony) - IEEE Xplore: A microprocessor with a 128-bit CPU, ten floating-point MACs, four floating-point dividers, and an MPEG-2 decoder
    I expect that none of this would be happening yet if Intel could achieve greater uptake of their IA-64 product.
    The 'Itanium' is not particularly well thought of by one of my favorite columnists:

    How the Itanium Killed the Computer Industry - Columns by PC Magazine

    Last edited by chuckr; 24 Oct 2009 at 06:34.


  8. Posts : 488
    Win 7 Pro x64 x 3, Win 7 Pro x86, Ubuntu 9.04
       #67

    Dream on, peeps; 128-bit won't go consumer in the next version of Windows. More than likely only server CPUs will be 128-bit, and more than likely that's the only version of Windows that will be 128-bit.


  9. Posts : 554
    Windows 7 Professional x64 SP1
       #68

    fakeasdf said:
    Dream on, peeps; 128-bit won't go consumer in the next version of Windows. More than likely only server CPUs will be 128-bit, and more than likely that's the only version of Windows that will be 128-bit.
    Then we'll have the same crowd that was so gung-ho on making Windows Server 2003 into a workstation go nuts over a 128-bit Server edition of Windows, trying to make that into a workstation.


  10. Posts : 11
    WinXP Pro
       #69

    Bah. Just get those flash drives larger and cheaper, that's where the really good improvements are.


 