home.social

#futureofeconomy — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #futureofeconomy, aggregated by home.social.

  1. Why Telegram Must Embed Tor: A Complementary Alliance for Permanent Resilience

    May 15, 2026 marked a turning point: mass blockades of MTProto proxies were confirmed. The old centralized bypass model has hit a dead end, with regulators adding IPs to blacklists faster than Telegram can replace them.

    But we saw it coming. On May 14, we published an architectural solution that turns Telegram’s vulnerability into its greatest strength.

    🔻 The Problem: Fragility of Centralization
    Telegram relies on its own servers and proxies. Once identified, access is cut off.
    Tor is resilient but slow and lacks an economic model for scaling.

    🔁 The Solution: Symbiosis, Not Competition
    We propose integrating Tor directly into the Telegram client. This isn’t just a “bridge” — it’s a paradigm shift:
    Tor as Transport: A toggle in settings routes traffic through an unblockable anonymous network.
    Users as Infrastructure: Any Telegram user can voluntarily become a Tor relay.
    TON Economy: Relays earn TON for bandwidth. Smart contracts ensure transparent payouts.
    Update Security (iSE): Nodes stay trusted during updates via the iSE DS registry and TPM. Keys aren’t wiped; the system verifies image hashes against a signed registry.
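    The registry check in that last point can be sketched concretely. Everything below is illustrative: the file names and the one-hash-per-line registry format are assumptions, not the actual iSE DS layout, and a real relay would verify the registry's signature before trusting it.

```shell
# Illustrative sketch of an iSE-style registry check. File names and the
# registry format ("<sha256>  <image>" per line) are assumptions, not the
# actual iSE DS format.

# Toy image and registry, created here so the sketch is self-contained.
printf 'relay image bytes' > relay-update.img
HASH=$(sha256sum relay-update.img | awk '{print $1}')
printf '%s  relay-update.img\n' "$HASH" > registry.txt

# In a real deployment the registry's signature would be checked first,
# e.g. with a detached signature: gpg --verify registry.txt.sig registry.txt

# Accept the update only if the image's hash is listed in the signed registry.
if grep -q "^${HASH} " registry.txt; then
    echo "image trusted: applying update"
else
    echo "image not in registry: refusing update" >&2
    exit 1
fi
```

    The point of the design is that only the image hash is checked against the signed list; the node's own keys are never touched during the update.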

    🌍 Why It Benefits Everyone
    Telegram: Gets an “unkillable” network. Blocking 900M devices is impossible.
    Tor: Gains a massive user base, sustainable funding, and mainstream adoption.
    You: Get stable, ad-free, surveillance-resistant access.

    🔧 What’s Next?
    We’ve already drafted the integration blueprint and built the iSE registry prototype.
    The engineering solution exists. It now needs technical execution and political will from Telegram and Tor Project.

    We don’t just critique. We build alternatives.
    Stay tuned: a working iSE DS prototype is already live online.

    #Telegram #Tor #Decentralization #TON #Privacy #MTProto #iSE #OpenSource

  2. World of the Apocalypse or Peaceful Apocalypse?

    Kiczynski warned of an "LLM apocalypse" and removed 138,000 lines of code. Thus Linux 7.1 was born...

    Jakub Kiczynski, a developer of the Linux networking subsystem, warned in a merge request about a threat posed by the rise of large language models, a phenomenon he called the "LLM apocalypse". As a result of that request, 138,000 lines of code were removed from the kernel.

    Linus Torvalds, the founder and chief maintainer of Linux, approved the change for inclusion in version 7.1-rc1, scheduled for release on April 26, 2026. This is the first time in the history of Linux that bug reports generated by artificial intelligence have led to the removal of working code.

    Kiczynski argued that code the wider ecosystem had disabled many years ago was still being compiled into the kernel. To survive the "LLM apocalypse", he believed, such code should either be handed to a new maintainer or deleted.

    The changes affected six kernel subsystems and comprised not only the removal of 138,000 lines but also 12,996 new changesets. The affected network protocols have been permanently removed from the kernel.

    Over the coming months the updated kernel will reach the server, mobile, and embedded devices running Linux, which underlines how much this change matters for system security and stability.

    ---

    Linux 7.0: Bash script, weekend and 23 years of fixes

    On April 12, 2026, Linux 7.0 was released, introducing several significant changes to how the operating system is developed. One key innovation was the official adoption of the Rust programming language in the kernel. In addition, Linux began using artificial intelligence to analyze and fix bugs.

    Linus Torvalds called these changes the new normal, emphasizing their importance for the future development of the system.

    The story behind these changes began with Nicholas Carlini, who spent several months running a bash script on his laptop. This script performed a simple but unusual process: it opened the kernel source code files, passed them to the Claude Opus 4.6 artificial intelligence model, and asked it to find vulnerabilities.
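    The post shows no code, but the loop it describes can be sketched roughly as follows. The `ask_model` function is a stub standing in for whatever command the real script used to query the model, and the toy source tree exists only so the sketch runs on its own.

```shell
# Hypothetical reconstruction of the loop described above. `ask_model` is a
# stub standing in for an actual call to the model's API, which the post
# does not show.

ask_model() {
    # A real script would send the file's contents plus a "find
    # vulnerabilities" prompt to the model and print its findings.
    echo "reviewed $1 (stub: no real analysis performed)"
}

# Toy source tree so the sketch is self-contained.
mkdir -p kernel-src
printf 'int main(void) { return 0; }\n' > kernel-src/demo.c

: > findings.log
find kernel-src -name '*.c' | while read -r file; do
    ask_model "$file" >> findings.log
done
```

    The pattern is deliberately dumb: no parsing, no heuristics, just every source file pushed through the model one at a time, which is exactly what made the overnight run cheap.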

    Carlini did not expect much, but one day the model discovered a critical vulnerability in code used to share files over the network. That code ran on a wide variety of systems, including company file servers, hospital storage devices, school servers, and cloud storage at providers such as AWS, Google Cloud, and Azure.

    The vulnerability was so serious that even an intern connected to the office Wi-Fi could run a short script and gain access to the file server, reading sensitive data, deleting important files, and installing malware.

    These events underscored the value of using artificial intelligence to analyze code and detect vulnerabilities, and Linus Torvalds and the Linux developers decided to adopt the technology to improve the security and stability of the system.
    Apocalypse for browsers, search engines, and Tor
    
    In the age of artificial intelligence (AI), the software world is on the verge of significant changes. New technologies, such as large language models (LLMs), open up opportunities to automatically search for vulnerabilities in code. This can lead to serious consequences for many projects, including Mozilla, Google, and Tor.
    
    Danger of dead code
    
    Dead code, that is, code that is no longer used, is a serious threat: it can harbor vulnerabilities that LLMs are now able to detect automatically. Detecting them once demanded considerable effort and time; with LLMs it is fast and cheap.
    
    Changing the economics of vulnerability hunting
    
    Finding vulnerabilities used to require expensive experts conducting week-long code reviews. Now a script left running overnight lets an LLM analyze thousands of lines of code and flag potential problems. This changes the economics of vulnerability hunting and makes it far more accessible.
    
    Projects under threat
    
    Many large projects, including Mozilla and Google, maintain extensive archives of outdated code. That becomes a problem once LLMs start finding vulnerabilities in those codebases. Companies will be forced either to audit and remove the dead code or to explain to their users why their data may be compromised through outdated protocols.
    
    Solution for Tor
    
    The problem is particularly acute for the Tor project. Old relays go un-updated because operators fear losing their accumulated reputation, and that leaves the network vulnerable to attack. With the new approach proposed by the authors, however, Tor can update its relays quickly and safely without that reputation being lost.
    
    Rapid response mechanism
    
    The authors propose creating a registry of allowed images, using the TPM as a flag keeper, and binding a relay's identity to its key rather than to the image contents. Operators can then update quickly and safely as long as the new image appears in the registry, so an attack can be closed in hours rather than months.
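    One way to read "binding the identity to the key, not to the content" is that the relay's long-term key lives outside the image, so its fingerprint, and with it the relay's reputation, survives any update. A minimal sketch with OpenSSL; the paths and the fingerprint scheme are assumptions, not anything the post specifies.

```shell
# Sketch of key-bound identity: the long-term key is generated once and
# reused across image updates, so the relay's fingerprint never changes.
# Paths and the fingerprint scheme are illustrative.

if [ ! -f relay_id.pem ]; then
    openssl genpkey -algorithm ed25519 -out relay_id.pem
fi

# The identity is a digest of the public key, independent of any image bytes.
FP=$(openssl pkey -in relay_id.pem -pubout -outform DER | sha256sum | awk '{print $1}')
echo "relay identity: $FP"

# Simulate an update: the image changes, the identity does not.
printf 'new image contents' > relay-image.img
FP_AFTER=$(openssl pkey -in relay_id.pem -pubout -outform DER | sha256sum | awk '{print $1}')
test "$FP" = "$FP_AFTER" && echo "identity preserved across update"
```

    Because the fingerprint depends only on the key, swapping the image in and out has no effect on how the directory authorities see the relay.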
    
    The future of Mozilla, Google, and others
    
    Mozilla, Google, and other companies will also need to review their code. Outdated APIs, experimental features, and drivers for obsolete hardware can all be sources of vulnerabilities. These companies will be forced either to remove dead code or to explain to their users why their data may be compromised by outdated technologies.
    
    Tor gets its chance
    
    While other companies struggle with the fallout of newly discovered vulnerabilities, Tor can use the proposed mechanism to update its relays quickly and safely. That keeps the project decentralized and resistant to attack; Tor can thus not only survive the LLM apocalypse but emerge from it more secure and efficient.
    
    Conclusion
    
    The authors have shown how new technologies can be used to solve old problems. They not only gave Tor a way to change, but also demonstrated how to survive the era of the LLM apocalypse. That is an important step forward for software development and security.