home.social

#cobol — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #cobol, aggregated by home.social.

  1. Symas will be at Carolina Codes in August! A second talk on COBOL #cobol #cobolworx and a talk on #openldap. This is a nice regional conference in lovely Greenville, SC.

  2. RE: disabled.social/@vlrny/1165408

    Not voting, as I was born #disabled

    In the #blind community, I was #sighted, although I was born with #cataracts and have always been #LegallyBlind. My dad was blind, and navigated everywhere with his #whitecane. He was part of an #advocacy #organization, and he traveled around the #US by air. His sister, my Aunt Buffy, was a #COBOL #programmer for Sears in #Chicago. She took me to her work once. We rode the #El with her #GuideDog, Ginger.

    This meant that it was tough to get away with "Blind kids can't..." unless I'd actually *tried*.

    #RepresentationMatters.

  6. Only #SVG and #MathML remain relevant browser-based #XML technologies. #RSS / #Atom also still exist.

    #XSLT lives on on the server and, in version 3.0 (soon 4.0), remains as important as ever, primarily for locally run document-processing workflows (#DITA, #DocBook, #TEI). And with #SaxonJS there is a browser-side alternative as well.

    Retired XMLers will surely be just as sought-after as in the case of #COBOL.

  11. “Quantum computation is … nothing less than a distinctly new way of harnessing nature”*…

    As the tools in the world around us change, the world, and we, change with them. The onslaught of AI is the change that seems to be grabbing most of our mindshare these days, and with reason. But there are, of course, other changes (in biotech, in materials science, and elsewhere) that are also going to be hugely impactful.

    Today, a look at the computing technology stalking up behind AI: quantum computing. As enthusiasts like David Deutsch (author of the quote above) argue, it can have tremendous benefits, perhaps especially in our ability to model (and thus better understand) our reality.

    But quantum computing will, if/when it arrives, also present huge challenges to us as individuals and as societies, perhaps most prominently in its threat to the ways in which we protect our systems and our information: We’ve felt pretty safe for decades, secure in the knowledge that we could lose passwords to phishing or hacks, but that it would take the “classical” computers we have a billion years to break today’s RSA-2048 encryption. A quantum computer could crack it in as little as a hundred seconds.

    The technology has been “somewhere on the horizon” for 30 years… so not something that has seemed urgent to confront. But progress has accelerated; a recent Google paper reports on a programming and architectural breakthrough that greatly reduces the computing resources necessary to break classical cryptography… putting the prospect of “Q-Day” (the point at which quantum computers become powerful enough to break standard encryption methods (RSA, ECC), endangering global digital security) much closer, which would put everything from crypto-wallets to our e-banking accounts at risk.

    Charlie Wood brings us up to speed…

    Some 30 years ago, the mathematician Peter Shor took a niche physics project — the dream of building a computer based on the counterintuitive rules of quantum mechanics — and shook the world.

    Shor worked out a way for quantum computers to swiftly solve a couple of math problems that classical computers could complete only after many billions of years. Those two math problems happened to be the ones that secured the then-emerging digital world. The trustworthiness of nearly every website, inbox, and bank account rests on the assumption that these two problems are impossible to solve. Shor’s algorithm proved that assumption wrong.
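
    Those two problems are integer factoring and the discrete logarithm. Shor's insight was to reduce factoring to period-finding: once you know the period r of a^x mod N, elementary number theory usually hands you a factor of N. A minimal illustrative sketch follows, with the period found by slow classical brute force (the step a quantum computer performs exponentially faster); the function names are my own, not from any paper:

```python
from math import gcd

def period(a, n):
    # Find the order r of a modulo n: the smallest r with a^r ≡ 1 (mod n).
    # This brute-force loop is the step Shor's algorithm does quantumly.
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    # The classical pre/post-processing around period-finding:
    # an even period r with a^(r/2) not ≡ -1 (mod n) yields a factor of n.
    g = gcd(a, n)
    if g != 1:
        return g                  # lucky: a already shares a factor with n
    r = period(a, n)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, n)         # square root of 1 mod n
    if y == n - 1:
        return None               # trivial root: retry with another a
    return gcd(y - 1, n)          # nontrivial root gives a nontrivial factor

print(shor_classical_part(15, 7))  # → 3, since 7 has period 4 mod 15
```

    For toy numbers the brute-force loop is instant; for a 2048-bit RSA modulus its running time is the "billions of years" cited above, which is exactly the step a fault-tolerant quantum computer would collapse.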

    For 30 years, Shor’s algorithm has been a security threat in theory only. Physicists initially estimated that they would need a colossal quantum machine with billions of qubits — the elements used in quantum calculations — to run it. That estimate has come down drastically over the years, falling recently to a million qubits. But it has still always sat comfortably beyond the modest capabilities of existing quantum computers, which typically have just hundreds of qubits.

    However, two different groups of researchers have just announced advances that notably reduce the gap between theoretical estimates and real machines. A star-studded team of quantum physicists at the California Institute of Technology went public with a design for a quantum computer that could break encryption with only tens of thousands of qubits and said that it had formed a company to build the machine. And researchers at Google announced that they had developed an implementation of Shor’s algorithm that is ten times as efficient as the best previous method.

    Neither company has the hardware to break encryption today. But the results underscore what some quantum physicists had already come to suspect: that powerful quantum computers may be years away, rather than decades. “If you care about privacy or you have secrets, then you better start looking for alternatives,” said Nikolas Breuckmann, a mathematical physicist at the University of Bristol, who did not work on either of the papers.

    While the new results may provide a jolt for the policymakers and corporations that guard our digital infrastructure, they also signal the rapid progress that physicists have made toward building machines that will let them more thoroughly explore the quantum world.

    “We’re going to actually do this,” said Dolev Bluvstein, a Caltech physicist and CEO of the new company, Oratomic…

    [Wood unpacks the history of the development of the technology and explores the challenges that remain; he concludes…]

    … If any group succeeds at building a quantum computer that can realize Shor’s algorithm, it will mark the end of an era — specifically, the “Noisy Intermediate Scale Quantum” era, as Preskill dubbed the pre-error-correction period in a 2018 paper. Each researcher has a vision for what to pursue first with a machine in the new “fault-tolerant” era.

    [Robert] Huang said he would start by running Shor’s algorithm, just to prove that the device works. After that, he said he would try to use it to speed up machine learning — an application to be detailed in coming work.

    Most of the architects building quantum computers, whether at Oratomic or other startups, are physicists at heart. They’re interested in physics, not cryptography. Specifically, they’re interested in all the things a computer fluent in the language of quantum mechanics could teach them about the quantum realm, such as what sort of materials might become superconductors even at warm temperatures. Preskill, for his part, would like to simulate the quantum nature of space-time.

    The Caltech group knows it has years of work ahead before any of its dreams have a chance of coming true. But the researchers can’t wait to get started. “Pick a cooler life quest than building the world’s first quantum computer with your friends!” said a jubilant Bluvstein, reached by phone shortly before their paper went live, before rushing off to celebrate…

    Eminently worth reading in full: “New Advances Bring the Era of Quantum Computers Closer Than Ever,” from @walkingthedot.bsky.social in @quantamagazine.bsky.social.

    * David Deutsch, The Fabric of Reality

    ###

    As we prepare, we might take a moment to appreciate just how deep the legacy systems challenged by quantum computing run, recalling that on this date in 1959 Mary Hawes, a computer scientist at the Burroughs Corporation, held a meeting of computer users, manufacturers, and academics at the University of Pennsylvania aimed at creating a common business-oriented programming language. At the meeting, Grace Hopper suggested that they ask the Department of Defense to fund the effort to create such a language. Also attending was Charles Phillips, director of the Data System Research Staff at the DoD, who was excited by the possibility of a common language streamlining their operations. He agreed to sponsor the creation of such a language. This was the genesis of what would eventually become the COBOL language.

    To this day COBOL is still the most common programming language used in business, finance, and administrative systems for companies and governments, primarily on mainframe systems, with around 200 billion lines of code still in production use… all of which are in question and/or at risk in a world of quantum computing.

    source

    #COBOL #computerSecurity #computers #computing #crypto #cryptocurrency #culture #GraceHopper #history #MaryHawes #quantum #quantumComputing #Science #security #Technology
  12. “Quantum computation is … nothing less than a distinctly new way of harnessing nature”*…

    As the tools in the world around us change, the world– and we– change with them. The onslaught of AI is the change that seems to be grabbing most of our mindshare these days… and with reason. But there are, of course, other changes (in biotech, in materials science, et al.) that are also going to be hugely impactful.

    Today, a look at the computing technology stalking up behind AI: quantum computing. As enthusiasts like David Deutsch (author of the quote above) argue, it can have tremendous benefits, perhaps especially in our ability to model (and thus better understand) our reality.

    But quantum computing will, if/when it arrives, also present huge challenges to us as individuals and as societies– perhaps most prominently in its threat to the ways in which we protect our systems and our information: We’ve felt pretty safe for decades, secure in the knowledge that we could lose passwords to phising or hacks, but that it would take the “classical” computers we have 1 billion years to break today’s RSA-2048 encryption. A quantum computer could crack it in as little as a hundred seconds.

    The technology has been “somewhere on the horizon” for 30 years… so not something that has seemed urgent to confront. But progress has accelerated; a recent Google paper reports on a programming and architectural breakthrough that greatly reduces the computing resources necessary to break classical cryptography… putting the prospect of “Q-Day” (the point at which quantum computers become powerful enough to break standard encryption methods (RSA, ECC), endangering global digital security) much closer, which would put everything from crypto-wallets to our e-banking accounts at risk.

    Charlie Wood brings us up to speed…

    Some 30 years ago, the mathematician Peter Shor took a niche physics project — the dream of building a computer based on the counterintuitive rules of quantum mechanics — and shook the world.

    Shor worked out a way for quantum computers to swiftly solve a couple of math problems that classical computers could complete only after many billions of years. Those two math problems happened to be the ones that secured the then-emerging digital world. The trustworthiness of nearly every website, inbox, and bank account rests on the assumption that these two problems are impossible to solve. Shor’s algorithm proved that assumption wrong.

    For 30 years, Shor’s algorithm has been a security threat in theory only. Physicists initially estimated that they would need a colossal quantum machine with billions of qubits — the elements used in quantum calculations — to run it. That estimate has come down drastically over the years, falling recently to a million qubits. But it has still always sat comfortably beyond the modest capabilities of existing quantum computers, which typically have just hundreds of qubits.

    However, two different groups of researchers have just announced advances that notably reduce the gap between theoretical estimates and real machines. A star-studded team of quantum physicists at the California Institute of Technology went public with a design for a quantum computer that could break encryption with only tens of thousands of qubits and said that it had formed a company to build the machine. And researchers at Google announced that they had developed an implementation of Shor’s algorithm that is ten times as efficient as the best previous method.

    Neither company has the hardware to break encryption today. But the results underscore what some quantum physicists had already come to suspect: that powerful quantum computers may be years away, rather than decades. “If you care about privacy or you have secrets, then you better start looking for alternatives,” said Nikolas Breuckmann, a mathematical physicist at the University of Bristol, who did not work on either of the papers.

    While the new results may provide a jolt for the policymakers and corporations that guard our digital infrastructure, they also signal the rapid progress that physicists have made toward building machines that will let them more thoroughly explore the quantum world.

    “We’re going to actually do this,” said Dolev Bluvstein, a Caltech physicist and CEO of the new company, Oratomic…

    [Wood unpacks the history of the development of the technology and explores the challenges that remain; he concludes…]

    … If any group succeeds at building a quantum computer that can realize Shor’s algorithm, it will mark the end an era — specifically, the “Noisy Intermediate Scale Quantum” era, as Preskill dubbed the pre-error-correction period in a 2018 paper. Each researcher has a vision for what to pursue first with a machine in the new “fault-tolerant” era.

    [Robert] Huang said he would start by running Shor’s algorithm, just to prove that the device works. After that, he said he would try to use it to speed up machine learning — an application to be detailed in coming work.

    Most of the architects building quantum computers, whether at Oratomic or other startups, are physicists at heart. They’re interested in physics, not cryptography. Specifically, they’re interested in all the things a computer fluent in the language of quantum mechanics could teach them about the quantum realm, such as what sort of materials might become superconductors even at warm temperatures. Preskill, for his part, would like to simulate the quantum nature of space-time.

    The Caltech group knows it has years of work ahead before any of its dreams have a chance of coming true. But the researchers can’t wait to get started. “Pick a cooler life quest than building the world’s first quantum computer with your friends!” said a jubilant Bluvstein, reached by phone shortly before their paper went live, before rushing off to celebrate…

    Eminently worth reading in full: “New Advances Bring the Era of Quantum Computers Closer Than Ever,” from @walkingthedot.bsky.social in @quantamagazine.bsky.social.

    * David Deutsch, The Fabric of Realityy

    ###

    As we prepare, we might take a moment to appreciate just how vastly and deeply the legacy systems challenged by quantum computing run, recalling that on this date in 1959 Mary Hawes, a computer scientist for the Burroughs Corporation held a meeting of computers users, manufacturers, and academics at the University of Pennsylvania aimed at creating a common business oriented programming language. At the meeting, representative Grace Hopper suggested that they ask the Department of Defense to fund the effort to create such a language. Also attending was Charles Phillips who was director of the Data System Research Staff at the DoD and was excited by the possibility of a common language streamlining their operations. He agreed to sponsor the creation of such a language. This was the genesis of what would eventually become the COBOL language.

    To this day COBOL is still the most common programming language used in business, finance, and administrative systems for companies and governments, primarily on mainframe systems, with around 200 billion lines of code still in production use… all of which are in question and/or at risk in a world of quantum computing.

    source

    #COBOL #computerSecurity #computers #computing #crypto #cryptocurrency #culture #GraceHopper #history #MaryHawes #quantum #quantumComputing #Science #security #Technology
  13. “Quantum computation is … nothing less than a distinctly new way of harnessing nature”*…

    As the tools in the world around us change, the world– and we– change with them. The onslaught of AI is the change that seems to be grabbing most of our mindshare these days… and with reason. But there are, of course, other changes (in biotech, in materials science, et al.) that are also going to be hugely impactful.

    Today, a look at the computing technology stalking up behind AI: quantum computing. As enthusiasts like David Deutsch (author of the quote above) argue, it can have tremendous benefits, perhaps especially in our ability to model (and thus better understand) our reality.

    But quantum computing will, if/when it arrives, also present huge challenges to us as individuals and as societies– perhaps most prominently in its threat to the ways in which we protect our systems and our information: We’ve felt pretty safe for decades, secure in the knowledge that we could lose passwords to phising or hacks, but that it would take the “classical” computers we have 1 billion years to break today’s RSA-2048 encryption. A quantum computer could crack it in as little as a hundred seconds.

    The technology has been “somewhere on the horizon” for 30 years… so not something that has seemed urgent to confront. But progress has accelerated; a recent Google paper reports on a programming and architectural breakthrough that greatly reduces the computing resources necessary to break classical cryptography… putting the prospect of “Q-Day” (the point at which quantum computers become powerful enough to break standard encryption methods (RSA, ECC), endangering global digital security) much closer, which would put everything from crypto-wallets to our e-banking accounts at risk.

    Charlie Wood brings us up to speed…

    Some 30 years ago, the mathematician Peter Shor took a niche physics project — the dream of building a computer based on the counterintuitive rules of quantum mechanics — and shook the world.

    Shor worked out a way for quantum computers to swiftly solve a couple of math problems that classical computers could complete only after many billions of years. Those two math problems happened to be the ones that secured the then-emerging digital world. The trustworthiness of nearly every website, inbox, and bank account rests on the assumption that these two problems are impossible to solve. Shor’s algorithm proved that assumption wrong.

    For 30 years, Shor’s algorithm has been a security threat in theory only. Physicists initially estimated that they would need a colossal quantum machine with billions of qubits — the elements used in quantum calculations — to run it. That estimate has come down drastically over the years, falling recently to a million qubits. But it has still always sat comfortably beyond the modest capabilities of existing quantum computers, which typically have just hundreds of qubits.

    However, two different groups of researchers have just announced advances that notably reduce the gap between theoretical estimates and real machines. A star-studded team of quantum physicists at the California Institute of Technology went public with a design for a quantum computer that could break encryption with only tens of thousands of qubits and said that it had formed a company to build the machine. And researchers at Google announced that they had developed an implementation of Shor’s algorithm that is ten times as efficient as the best previous method.

    Neither company has the hardware to break encryption today. But the results underscore what some quantum physicists had already come to suspect: that powerful quantum computers may be years away, rather than decades. “If you care about privacy or you have secrets, then you better start looking for alternatives,” said Nikolas Breuckmann, a mathematical physicist at the University of Bristol, who did not work on either of the papers.

    While the new results may provide a jolt for the policymakers and corporations that guard our digital infrastructure, they also signal the rapid progress that physicists have made toward building machines that will let them more thoroughly explore the quantum world.

    “We’re going to actually do this,” said Dolev Bluvstein, a Caltech physicist and CEO of the new company, Oratomic…

    [Wood unpacks the history of the development of the technology and explores the challenges that remain; he concludes…]

    … If any group succeeds at building a quantum computer that can realize Shor’s algorithm, it will mark the end an era — specifically, the “Noisy Intermediate Scale Quantum” era, as Preskill dubbed the pre-error-correction period in a 2018 paper. Each researcher has a vision for what to pursue first with a machine in the new “fault-tolerant” era.

    [Robert] Huang said he would start by running Shor’s algorithm, just to prove that the device works. After that, he said he would try to use it to speed up machine learning — an application to be detailed in coming work.

    Most of the architects building quantum computers, whether at Oratomic or other startups, are physicists at heart. They’re interested in physics, not cryptography. Specifically, they’re interested in all the things a computer fluent in the language of quantum mechanics could teach them about the quantum realm, such as what sort of materials might become superconductors even at warm temperatures. Preskill, for his part, would like to simulate the quantum nature of space-time.

    The Caltech group knows it has years of work ahead before any of its dreams have a chance of coming true. But the researchers can’t wait to get started. “Pick a cooler life quest than building the world’s first quantum computer with your friends!” said a jubilant Bluvstein, reached by phone shortly before their paper went live, before rushing off to celebrate…

    Eminently worth reading in full: “New Advances Bring the Era of Quantum Computers Closer Than Ever,” from @walkingthedot.bsky.social in @quantamagazine.bsky.social.

    * David Deutsch, The Fabric of Realityy

    ###

    As we prepare, we might take a moment to appreciate just how vastly and deeply the legacy systems challenged by quantum computing run, recalling that on this date in 1959 Mary Hawes, a computer scientist for the Burroughs Corporation held a meeting of computers users, manufacturers, and academics at the University of Pennsylvania aimed at creating a common business oriented programming language. At the meeting, representative Grace Hopper suggested that they ask the Department of Defense to fund the effort to create such a language. Also attending was Charles Phillips who was director of the Data System Research Staff at the DoD and was excited by the possibility of a common language streamlining their operations. He agreed to sponsor the creation of such a language. This was the genesis of what would eventually become the COBOL language.

    To this day COBOL is still the most common programming language used in business, finance, and administrative systems for companies and governments, primarily on mainframe systems, with around 200 billion lines of code still in production use… all of which are in question and/or at risk in a world of quantum computing.

    source

    #COBOL #computerSecurity #computers #computing #crypto #cryptocurrency #culture #GraceHopper #history #MaryHawes #quantum #quantumComputing #Science #security #Technology
  14. “Quantum computation is … nothing less than a distinctly new way of harnessing nature”*…

    As the tools in the world around us change, the world– and we– change with them. The onslaught of AI is the change that seems to be grabbing most of our mindshare these days… and with reason. But there are, of course, other changes (in biotech, in materials science, et al.) that are also going to be hugely impactful.

    Today, a look at the computing technology stalking up behind AI: quantum computing. As enthusiasts like David Deutsch (author of the quote above) argue, it can have tremendous benefits, perhaps especially in our ability to model (and thus better understand) our reality.

    But quantum computing will, if/when it arrives, also present huge challenges to us as individuals and as societies– perhaps most prominently in its threat to the ways in which we protect our systems and our information: We’ve felt pretty safe for decades, secure in the knowledge that we could lose passwords to phising or hacks, but that it would take the “classical” computers we have 1 billion years to break today’s RSA-2048 encryption. A quantum computer could crack it in as little as a hundred seconds.

    The technology has been “somewhere on the horizon” for 30 years… so not something that has seemed urgent to confront. But progress has accelerated; a recent Google paper reports on a programming and architectural breakthrough that greatly reduces the computing resources necessary to break classical cryptography… putting the prospect of “Q-Day” (the point at which quantum computers become powerful enough to break standard encryption methods (RSA, ECC), endangering global digital security) much closer, which would put everything from crypto-wallets to our e-banking accounts at risk.

    Charlie Wood brings us up to speed…

    Some 30 years ago, the mathematician Peter Shor took a niche physics project — the dream of building a computer based on the counterintuitive rules of quantum mechanics — and shook the world.

    Shor worked out a way for quantum computers to swiftly solve a couple of math problems that classical computers could complete only after many billions of years. Those two math problems happened to be the ones that secured the then-emerging digital world. The trustworthiness of nearly every website, inbox, and bank account rests on the assumption that these two problems are impossible to solve. Shor’s algorithm proved that assumption wrong.

    For 30 years, Shor’s algorithm has been a security threat in theory only. Physicists initially estimated that they would need a colossal quantum machine with billions of qubits — the elements used in quantum calculations — to run it. That estimate has come down drastically over the years, falling recently to a million qubits. But it has still always sat comfortably beyond the modest capabilities of existing quantum computers, which typically have just hundreds of qubits.

    However, two different groups of researchers have just announced advances that notably reduce the gap between theoretical estimates and real machines. A star-studded team of quantum physicists at the California Institute of Technology went public with a design for a quantum computer that could break encryption with only tens of thousands of qubits and said that it had formed a company to build the machine. And researchers at Google announced that they had developed an implementation of Shor’s algorithm that is ten times as efficient as the best previous method.

    Neither company has the hardware to break encryption today. But the results underscore what some quantum physicists had already come to suspect: that powerful quantum computers may be years away, rather than decades. “If you care about privacy or you have secrets, then you better start looking for alternatives,” said Nikolas Breuckmann, a mathematical physicist at the University of Bristol, who did not work on either of the papers.

    While the new results may provide a jolt for the policymakers and corporations that guard our digital infrastructure, they also signal the rapid progress that physicists have made toward building machines that will let them more thoroughly explore the quantum world.

    “We’re going to actually do this,” said Dolev Bluvstein, a Caltech physicist and CEO of the new company, Oratomic…

    [Wood unpacks the history of the development of the technology and explores the challenges that remain; he concludes…]

    … If any group succeeds at building a quantum computer that can realize Shor’s algorithm, it will mark the end of an era — specifically, the “Noisy Intermediate-Scale Quantum” era, as Preskill dubbed the pre-error-correction period in a 2018 paper. Each researcher has a vision for what to pursue first with a machine in the new “fault-tolerant” era.

    [Robert] Huang said he would start by running Shor’s algorithm, just to prove that the device works. After that, he said he would try to use it to speed up machine learning — an application to be detailed in coming work.

    Most of the architects building quantum computers, whether at Oratomic or other startups, are physicists at heart. They’re interested in physics, not cryptography. Specifically, they’re interested in all the things a computer fluent in the language of quantum mechanics could teach them about the quantum realm, such as what sort of materials might become superconductors even at warm temperatures. Preskill, for his part, would like to simulate the quantum nature of space-time.

    The Caltech group knows it has years of work ahead before any of its dreams have a chance of coming true. But the researchers can’t wait to get started. “Pick a cooler life quest than building the world’s first quantum computer with your friends!” said a jubilant Bluvstein, reached by phone shortly before their paper went live, before rushing off to celebrate…

    Eminently worth reading in full: “New Advances Bring the Era of Quantum Computers Closer Than Ever,” from @walkingthedot.bsky.social in @quantamagazine.bsky.social.

    * David Deutsch, The Fabric of Reality

    ###

    As we prepare, we might take a moment to appreciate just how deep the legacy systems challenged by quantum computing run, recalling that on this date in 1959 Mary Hawes, a computer scientist at the Burroughs Corporation, convened a meeting of computer users, manufacturers, and academics at the University of Pennsylvania aimed at creating a common business-oriented programming language. At the meeting, Grace Hopper suggested that they ask the Department of Defense to fund the effort. Also attending was Charles Phillips, director of the Data System Research Staff at the DoD, who was excited by the possibility of a common language streamlining the department's operations and agreed to sponsor the project. This was the genesis of what would eventually become the COBOL language.

    To this day COBOL is still the most common programming language used in business, finance, and administrative systems for companies and governments, primarily on mainframe systems, with around 200 billion lines of code still in production use… all of which are in question and/or at risk in a world of quantum computing.

    source

    #COBOL #computerSecurity #computers #computing #crypto #cryptocurrency #culture #GraceHopper #history #MaryHawes #quantum #quantumComputing #Science #security #Technology
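    The “two math problems” in the Quanta excerpt above are integer factoring and the discrete logarithm, and Shor’s algorithm cracks both via order finding. A toy classical sketch of that reduction — brute-forcing the order r that a quantum computer finds quickly; the numbers are illustrative only:

    ```python
    # Toy illustration of the number theory behind Shor's algorithm.
    # The quantum speedup lies entirely in finding the order r of a mod N;
    # here we brute-force r classically for a small modulus. Given an even r,
    # gcd(a^(r/2) - 1, N) yields a nontrivial factor of N.
    from math import gcd

    def order(a, n):
        """Smallest r > 0 with a**r % n == 1 -- the step a quantum computer accelerates."""
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_factor(n, a):
        """Try to factor n using the order of a mod n; None means 'retry with another a'."""
        if gcd(a, n) != 1:
            return gcd(a, n)          # lucky guess: a already shares a factor with n
        r = order(a, n)
        if r % 2:
            return None               # odd order: pick a different a
        f = gcd(pow(a, r // 2, n) - 1, n)
        return f if 1 < f < n else None

    print(shor_factor(15, 7))  # → 3 (15 = 3 × 5)
    ```

    On a real quantum machine, `order` is replaced by a quantum circuit; everything else in the procedure stays classical.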
  15. #cobol #mainframe programming is insane, ha. Less really because of the language, and more because of the wild environment. This is old, yeah, but it seems very similar to modern ways. #retrocomputing

  16. COBOL on a MacBook? Yes, really.

    I took a legacy COBOL workload and modernized it using Java + the Foreign Function & Memory API (FFM). No mainframe needed.

    This is a practical way to bridge old and new systems without rewriting everything.

    👉 the-main-thread.com/p/cobol-ja

    #Java #Quarkus #COBOL #Modernization #FFM #OpenJDK
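    The post above does this with Java's Foreign Function & Memory API. The mechanics — load a native library, resolve a symbol, describe its signature, call it — look much the same from any managed runtime. Here is a minimal Python `ctypes` analog calling `strlen` from the already-loaded C runtime; in a real COBOL bridge you would instead load a module compiled with GnuCOBOL's `cobc -m` and call its PROGRAM-ID entry point (any such library or symbol names would be assumptions):

    ```python
    # Foreign-function sketch: resolve and call a native symbol from Python.
    # CDLL(None) returns a handle to the symbols already loaded into the
    # process (including libc on Linux/macOS), so no extra library is needed.
    import ctypes

    libc = ctypes.CDLL(None)
    libc.strlen.argtypes = [ctypes.c_char_p]   # declare the C signature...
    libc.strlen.restype = ctypes.c_size_t      # ...so ctypes marshals correctly

    print(libc.strlen(b"COBOL"))  # → 5
    ```

    Swapping `CDLL(None)` for a hypothetical `CDLL("libpayroll.so")` built with `cobc -m`, and `strlen` for the COBOL program's entry point, gives the shape of the bridge the post describes.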

  17. #Claude

    (3/n)

    ...collapses, because #Claude can now supposedly (for the first time!) optimize their #Legacy programming language #COBOL at low cost for their still widely used #Mainframe workhorses, when Claude is already so deeply integrated into processes at the #US #Pentagon that replacing it is said to take half a year, & when #DonaldTrump is publicly and with media fanfare moving against the company #Anthropic over the disclosure and use of #Claude in #AI weapons systems, as...

  18. „I deleted my source code“

    joppe.dev/2026/02/26/i-deleted

    „The classic development workflow we all know and loved, is disappearing fast. We will no longer care about the nitty gritty details of how the code works. We will only care that it does.“

    Experiment in #php that has unit tests, some spec markdown files plus skill files to fill the src folder on every push according to specs/tests.

    Old heads cry in #UML, model-driven development, and flow-based whatever. Can't remember all the hyped methodologies. Thought of visual-driven development, typed that into Google, and found "visual reverse engineering", which uses legacy UIs to guess what a system does and recreates it in "not #COBOL" or "not #Delphi"!?

    Lots of "The Purpose of a System is What it Does" systems out there nowadays, I guess. What do old cyberneticists make of all of this? #posiwid

  23. With all the COBOL vibe-coding talk going on, I would like to spotlight LDPL (ldpl-lang.org/), a COBOL-flavoured programming language that runs on modern UNIX-like systems (including macOS).

    #LDPL #programming #COBOL

  24. I'm considering apostatizing from the church of #emacs. the original allure was, for everything you want to do... "emacs has a package for that." that largely seems to hold up.
    but.

    #doomemacs manages the whole dependency graph for me, which is pretty great. Except when it fails. I made an attempt to learn #cobol for job hunting reasons, so i had to leave emacs to set that up.
    projectile seems to have some weird rules by which it decides what is and is not a project.
    but there's also the other old cliche; "emacs is a fine operating system; it just doesn't have a good text editor." doom emacs lets you use the spacebar instead of ctrl for everything, and on top of that includes evil mode so you use vim motions... Emacs wants to be a text editor; as evidenced by the fact that one never opens a "project", one opens text files. So if all of these other things are extra tools, why not just use vim? the text editor you're mimicking anyway?

    on the other hand, all of these are skill issues. 🤷

  25. IBM’s $40 B loss isn’t about a failed AI miracle – it’s the cost of translating massive COBOL estates. The push for “modernization” turned into a massive code‑migration gamble, even with Watsonx. Find out why legacy code still trumps hype and what it means for enterprise systems. #COBOL #LegacyCode #AITranslation #EnterpriseSystems

    🔗 aidailypost.com/news/ibms-usd-

  26. @amirbkhan @_elena If you stop using billionaire software, just you, you are depriving them of your revenue, attention, and data. Each of us getting off "platforms" (Windows, Mac, Facebook, X, etc.) is a member of the "walk away" revolt. And the continuing work to make Free and Open Source Software more accessible to everyone else takes power away from the oligarchs AND the government they own. We are #OpenLDAP and #GCC #COBOL, two links in the chain.

  27. How I migrated mainframe COBOL code to Java: the approaches I tried and why ANTLR is the best choice

    When I worked at a company abroad, I was given the task of migrating a COBOL invoice-calculation system from a mainframe to Java. It sounded simple enough: "We need to rewrite the old COBOL code in Java so the system can live on." I figured it would be routine legacy work: take the ancient code, carefully port it to a modern version of Java, maybe tidy up the architecture a little, make a few cosmetic fixes, and close the ticket. I already had a working plan in my head: a couple of iterations of auto-generating code with a neural network, and done. In short, it seemed like nothing special.

    habr.com/ru/articles/980846/

    #tutorial #java #cobol #migration #antlr4
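    The migration story above leans on ANTLR, which parses COBOL into a syntax tree that a code generator then walks. As a deliberately tiny stand-in for that pipeline, here is a one-rule "transpiler" that turns a COBOL MOVE statement into a Java assignment — a real grammar-driven translator handles hundreds of such rules, and the regex here is only a placeholder for a proper parse tree:

    ```python
    # Toy grammar-driven translation: one COBOL statement shape -> one Java statement.
    # A real migration (as in the linked article) uses a full ANTLR grammar and
    # tree visitors instead of this single regex rule.
    import re

    MOVE_RULE = re.compile(r"\s*MOVE\s+([\w-]+)\s+TO\s+([\w-]+)\s*\.?\s*$", re.IGNORECASE)

    def to_java_name(cobol_name):
        """COBOL names allow hyphens; map WS-TOTAL -> ws_total for Java."""
        return cobol_name.lower().replace("-", "_")

    def translate_move(line):
        m = MOVE_RULE.match(line)
        if not m:
            raise ValueError("unsupported statement: " + line)
        src, dst = m.groups()
        return f"{to_java_name(dst)} = {to_java_name(src)};"

    print(translate_move("MOVE WS-TOTAL TO INV-AMOUNT."))  # → inv_amount = ws_total;
    ```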

  28. Shift COBOL batch to IBM zIIP
    💰 Gigantic cost savings
    🪄 Optimize 80% of your CP workload
    💯 NO source code changes
    🥇 55+ years of mainframe experience

    Dive deeper in my article (in German):
    bit.ly/4nZXwQc
    #cobol #mainframe #zos #batch

  29. With some help from the SZE community, I can now log my service uptime and latency in #Db2 using #Golang !

    I've got a service called CompInvZ which links into my Db2Z host to find services, then it periodically tries to connect and records the result back to DB2

    A friend of mine took it a step further and is using #ISPF to interact with Db2 via #REXX

    That means #CICS is next on the list, thanks #Walmart !
    github.com/walmartlabs/zECS

    #Mainframe #IBM #ZOS #programming #COBOL #HLASM
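    The uptime/latency logger above is in Go against Db2, but the core loop is small enough to sketch. A Python approximation, assuming a plain TCP connect as the health check and a local SQLite table standing in for the Db2 one (table and column names are made up, not the poster's schema):

    ```python
    # Sketch of a service-inventory probe: try to connect, record the outcome.
    # SQLite stands in for Db2; the real service would read its target list
    # from the database instead of a hard-coded inventory.
    import socket
    import sqlite3
    import time

    def probe(host, port, timeout=2.0):
        """Return (ok, latency_seconds) for one TCP connection attempt."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True, time.monotonic() - start
        except OSError:
            return False, time.monotonic() - start

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE uptime (host TEXT, port INT, ok INT, latency REAL, ts REAL)")

    inventory = [("localhost", 9)]   # hypothetical targets; port 9 is usually closed
    for host, port in inventory:
        ok, latency = probe(host, port)
        db.execute("INSERT INTO uptime VALUES (?, ?, ?, ?, ?)",
                   (host, port, int(ok), latency, time.time()))
    db.commit()
    ```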

  34. Modula-2, UCSD P-System, and the birth of Scala

    I stumbled across this tidbit from Hacker News.

    I never liked that #Borland stuffs. And used to program in #Pascal in #UCSD P-system (my alma mater). When I got to the US Department of Defense they wanted me for my #C and #COBOL skills and then they sent me to an Air Force School where I studied Modula-2 and Ada.

    I did a lot of work in Modula-2, which doesn't exist anymore. Modula-3 does, but in the meantime Scala was in the works. #Ada is still actually a thing. We didn't want clever, like those one liner #Perl challenges that folks use to put in their signature lines to demonstrate how clever they thought they were through obfuscation.

    Clever is bad. Clever opens up a whole universe of unexpected behavior and potential vulnerabilities. Maybe that's why #Rust became so organically popular - because it's safe by design and nowadays it's included in the Linux kernel more and more.

    When you're designing software for missile guidance systems you most certainly do not want clever. The job is simple, and ambiguity is potentially catastrophic in warfare.

    Anyway, I really enjoyed this interview, I can identify with the #Timex_Sinclair - my dad bought me one and that membrane keyboard was horrendous, but I was persistent and eventually I was writing code in cutting edge languages on mainframes and #Vaxen.

    Many of the stories about how one thing or another came about were born of frustration; like the impetus for #Linus to write the #Linux kernel coz #MINIX just didn't cut it, and who wants to trudge through snow drifts in #Helsinki to the computer lab when you can be warm and cozy, drinking beers in your dorm room?

    This story is kinda like that too, which I can really appreciate, even though I've never played with #Scala.

    I hope you enjoy it too.

    https://www.artima.com/articles/the-origins-of-scala

    #tallship #FOSS #Modula_2 #Modula_3

  37. Shift COBOL batch programs to IBM zIIP
    💰 Gigantic cost savings
    🪄 Optimize 80% of your CP workload
    💯 NO Source Code changes
    🥇 55+ years mainframe expertise

    Dive deeper in my new article: bit.ly/43yNYEt
    #cobol #mainframe #zos #batch #offload #optimization

  38. 🚀 Breaking news from the future! #Hypercubic, the #startup you didn't know you needed, is here to save your grandpa's #COBOL #code from the '60s. 🕰️ Because nothing screams cutting-edge #AI like helping you untangle a mess from 1995—bet your grandchildren are thrilled! 😂
    hypercubic.ai/ #Future #Tech #Saving #HackerNews #ngated

  39. #COBOL may no longer be as lively a topic among IT people as it once was, but it remains a technology that is theoretically extinct yet, in practice, still has to be maintained. And sometimes we tell tall tales about it. So how do things really stand with it?

    #programming #IT

    codeaura.ai/why-cobol-code-sti

  40. 🧓 COBOL still runs the world.
    But what if AI could bridge the gap?

    Instead of rewriting old code, AI agents + experts can reverse-engineer, annotate, and modernize — without touching a single line of COBOL.

    Modernization isn’t starting over.
    It’s starting smarter.

    🔗 Link to the article: buff.ly/qUtykzD