mattip 2 days ago [-]
PyPy core dev here. If anyone is interested in helping out, either financially or with coding, we can be reached various ways. See https://pypy.org/contact.html
stuaxo 1 days ago [-]
The website should have a prominent Donate section, maybe have some tiers of donation like the Ladybird browser does.
I wanted to put a little £ towards the project but couldn't see a place to do it.
IanCal 1 days ago [-]
I don’t disagree about prominence, but for people here: the donation links are under the About section.
Thanks, I should add PyPy to the list of projects I send a little to ... PyPy should be better supported by organisations and shouldn't need individual contributions, but things are where they are I guess.
bob_theslob646 21 hours ago [-]
Thanks for sharing this. I just donated. Been using them for many many years.
eugenekolo 1 days ago [-]
Donated. Thank you and everyone else on the PyPy team.
I use PyPy regularly on an app of mine, and very often when I need to handle a compute-heavy load. It's typically over 5x faster than CPython. It turns stuff that takes impossibly long with CPython (nobody wants to wait 5 minutes...) into a response that returns in a few seconds.
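A minimal way to see this kind of difference yourself — a hypothetical harness, not the commenter's actual app:

```python
import time

def work(n):
    # A pure-Python hot loop: exactly the kind of code PyPy's
    # tracing JIT tends to speed up the most.
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()
result = work(10_000_000)
elapsed = time.perf_counter() - start
print(result)
print(f"{elapsed:.3f}s")  # run under both `python3` and `pypy3` to compare
```

Run the same file under both interpreters; the gap is usually largest on loops like this, where no time is spent in C extensions.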
Uptrenda 1 days ago [-]
Another suggestion for you all (IDK how helpful): when I see PyPy, I see that it's faster for CPU-bound work, but I'm thinking I/O-bound workloads would also see significant increases in the load they can handle. You could host a page that benchmarks common tasks like HTTP req/s (different types) with asyncio vs CPython. You could even have an automated tool that lets projects benchmark performance from a web page using PyPy without having to install or measure anything.
mattip 1 days ago [-]
Benchmarks are tricky. Do you have a specific use case you want sped up?
true_religion 2 hours ago [-]
It would also be nice to see benchmarks of how much faster PyPy is getting each version. I know there is a tracking page but it tracks dozens of tests and has no absolute reference summary by version.
An easy chart to show v3.x is 10% faster than the last version would be great.
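For what it's worth, the usual way to collapse a whole suite into one headline number per release is the geometric mean of per-benchmark speedups. A sketch, using hypothetical timings rather than real PyPy data:

```python
import math

def geomean_speedup(old_times, new_times):
    # Geometric mean of per-benchmark speedup ratios: the standard
    # way to summarize a benchmark suite in a single number.
    ratios = [old / new for old, new in zip(old_times, new_times)]
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

# Hypothetical per-benchmark timings (seconds) for two releases:
old = [2.0, 1.0, 4.0]
new = [1.0, 1.0, 2.0]
print(f"{geomean_speedup(old, new):.2f}x faster")  # 1.59x faster
```

The geometric mean is preferred over the arithmetic mean here because it treats a 2x speedup and a 2x slowdown as cancelling out.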
forsakenharmony 1 days ago [-]
I have to say the speed comparison on the front page seems hard to read / backwards
I feel like you should either put absolute numbers side by side or show how much faster PyPy is (instead of how much time it takes)
trueismywork 1 days ago [-]
Also, there's a big notice that it is unmaintained
RobotToaster 1 days ago [-]
And that the corporations using their work should donate if they actually want it maintained.
gzread 1 days ago [-]
Donating is for individuals. If a corporation wants something done, they can hire or contract someone to do that thing.
cfbolztereick 2 days ago [-]
PyPy isn't unmaintained. We are certainly fixing bugs and occasionally improving the JIT. However, the remaining core devs (me among them) don't have the capacity to keep up with CPython. So for supporting new CPython versions we'll need new people to step up. For 3.12 this has started; we have a new contributor who is pushing this along.
HaskLwp 1 days ago [-]
CPython has turned into a commercial enterprise where a small number of developers chase away everyone and periodically get useless projects funded by corporations that go nowhere after five years. Intelligent people have all left.
The 150th rewrite of unicodeobject.c is relatively benign (except that it probably costs RedHat money) but the other things are impossible to keep up with.
m000 1 days ago [-]
The text merged to the documentation is more concise than the PR title:
> not actively developed anymore
masklinn 1 days ago [-]
Which is just as wrong.
hyperpape 1 days ago [-]
I think the most significant boundary is given by the question: "is there a plan to support new minor versions of Python?" It sounds like there is not.
There may be non-zero maintenance work happening, but a project that only maintains support for old versions and will never adopt new ones is functionally one that the ecosystem will eventually forget about. Maybe you call that "under active development" but my response is "ok, then I don't care whether it's under active development, I (and 99.9% of other people) should care about whether it's going to support new minor versions."
On the other hand, if you don't support new minor versions day one, but you eventually support them, that's quite different.
crote 1 days ago [-]
More specifically, the Scientific Python community through SPEC 0[0] recommends that support for Python versions is dropped three years after their release. Python 3.12 was released in October 2023[1], so that community is going to drop support for it in October 2026.
Considering that PyPy is only just now starting to seriously work on supporting 3.12, there's a pretty high chance that it won't even be ready for use before becoming obsolete. At that point it doesn't even matter whether you want to call it "in active development", it is simply too far behind to be relevant.
What's the point of a three year window? It seems like a weird middle-point. Either you are in a position to choose/install your own interpreter and libraries or you are not.
If you can choose your own versions and care at all about new releases, you can track latest and greatest with at the very most a few months of lag. Six months of "support" is luxurious in this scenario.
If you can't choose your own versions, you are most likely stuck on some sort of LTS Linux and will need to make do with what they provide. In that case three years is a cruel joke, because almost everything will be more than three years old when it is first deployed in your environment.
jcattle 13 hours ago [-]
I guess the point of a three-year window is that the ecosystem can, at some point, adopt new language features.
When you have some kind of ecosystem rule for that, you can make these upgrade decisions with a lot more confidence.
For example in my project I have a dependency on zstandard. In 3.14 zstandard was added to the standard library. With this ecosystem wide 3 year support cycle I can in good confidence drop the dependency in three years and use the standard lib from then on.
I feel like it just prevents the ecosystem from going stale because some important core library is still supporting a really old version, thus preventing other smaller libraries from using new language features as well, to not exclude a large user base still on an old version.
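A hedged sketch of what the zstandard migration mentioned above could look like (assuming the stdlib module name from PEP 784, `compression.zstd`):

```python
import sys

def zstd_module_name(version_info=sys.version_info):
    # Pick an implementation: the stdlib module from 3.14 onwards
    # (PEP 784), the third-party PyPI package before that.
    if version_info >= (3, 14):
        return "compression.zstd"
    return "zstandard"

print(zstd_module_name((3, 14, 0)))  # compression.zstd
print(zstd_module_name((3, 11, 2)))  # zstandard
```

Note the two modules don't share an identical API, so in practice the fallback branch also needs a small adapter, not just a different import.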
SiempreViernes 1 days ago [-]
This is silly; there's no killer feature for scientific computing being added to Python that would make an existing PyPy codebase drop that dependency. Getting code validated takes a long time, and dropping something like PyPy would require re-validating the entire thing.
ajb 21 hours ago [-]
Unfortunately Python does add features in a drip-drip kind of way that makes being behind an experience with a lot of niggles. This is particularly the case for the type annotation system, which is retrofitted onto a language that obviously didn't have one originally. So it's being added slowly, in a very conservative way, and there are a lot of limitations and pain points that are gradually being improved (or at least progressed on). The upcoming lazy module loading will also immediately become a sticking point.
lmeyerov 1 days ago [-]
The phenomenon you're describing is why Cobol programmers still exist, and simultaneously why it's increasingly irrelevant to most programmers.
The killer feature is the ecosystem: easily and reliably reusing other libraries and tools that work out of the box with other Python code written in the last few years. There are individually neat features motivating the effort involved in upgrading a widely-used language & engine as well, but that kind of thinking misses the forest for the trees, unfortunately.
It's a bit surprising to me, in the age of AI coding, for this to be a problem. Most features seem friendly to bootstrapping with automation (ex: f-strings that support ' not just "), and it's interesting if any don't fall in that camp. The main discussion seems to still be framed by the 2024 comments, before Claude Code etc became widespread: https://github.com/orgs/pypy/discussions/5145 .
SiempreViernes 5 hours ago [-]
Most programmers aren't writing scientific software, which you can tell by claims that nicer f-strings is a pressing concern.
cozzyd 1 days ago [-]
The alternative is when you run a script that you last used a few years ago and now need again for some reason (very common in research), and you might end up spending way too much time making it work with your now-upgraded stack.
Sure, you could say you should have pinned dependencies, but that's a lot of overhead for a random script...
quietbritishjim 1 days ago [-]
They appear to be talking about CPython releases, taking into account how long those versions continue to be supported (in the sense of security updates). That's irrelevant for PyPy, which clearly supports Python versions on a different schedule.
ameliaquining 1 days ago [-]
It's not irrelevant, because if SPEC 0 says that a particular Python version is no longer supported, then libraries that follow it won't avoid language or standard library features that that version doesn't have. And then those libraries won't work in the corresponding PyPy version. If there isn't a newer PyPy version to upgrade to, then they won't work in PyPy at all.
Wowfunhappy 1 days ago [-]
You might make a different decision if you were targeting PyPy.
masklinn 1 days ago [-]
> I think the most significant boundary is given by the question: "is there a plan to support new minor versions of Python?" It sounds like there is not.
There is literally a Python 3.12 milestone in the bug tracker.
> my response is "ok, then I don't care whether it's under active development, I (and 99.9% of other people) should care about whether it's going to support new minor versions."
It sounds a lot more like your actual response is "I don't care about pypy".
Which is fine, most people don't to start with. You don't have to pretend just to concern-troll the project.
pansa2 2 days ago [-]
PyPy is a fantastic achievement and deserves far more support than it gets. Microsoft’s “Faster CPython” team tried to make Python 5x faster but only achieved ~1.5x in four years - meanwhile PyPy has been running at over 5x faster for well over a decade.
On the other hand, I always got the impression that the main goal of PyPy is to be a research project (on meta-tracing, STM etc) rather than a replacement for CPython in production.
Maybe that, plus the core Python team’s indifference towards non-CPython implementations, is why it doesn’t get the recognition it deserves.
mattip 2 days ago [-]
Third-party libraries like SciPy, scikit-learn, pandas, TensorFlow and PyTorch have been critical to Python’s success. Since CPython is written in C and exposes a nice C API, those libraries can leverage it to quickly move from (slow) Python to (fast) C/C++, hitting an optimum between speed of development and speed of runtime.
PyPy’s alternative, CFFI, was not attractive enough for the big players to adopt. And HPy, another alternative that would have played better with Cython and friends, came too late in the game; by that time PyPy development had lost momentum.
toxik 2 days ago [-]
PyPy on numpy heavy code is often a lot slower than CPython
mattip 1 days ago [-]
Yes. The C API those libraries use is a good fit for CPython and a bad fit for PyPy. Hence CFFI and HPy. Actually, many of the lessons from HPy are making their way into CPython, since their JIT and speedups face the same problems as PyPy. See https://github.com/py-ni
jjgreen 1 days ago [-]
I rather like Python and have used the C API extensively, "nice" is not the word I'd choose ...
glkindlmann 1 days ago [-]
Sorry, can you explain the connection between PyPy and CFFI a bit more (CFFI generates compiled extension modules to wrap an existing C library)? I have never used PyPy, but I use CFFI all the time (to wrap C libraries unrelated to Python so that I can use them from Python)
mattip 1 days ago [-]
CFFI is fast on PyPy. The JIT still cannot peer into the compiled C/C++ code, but it can generate efficient interface code since there is a dedicated _cffi_backend module built into PyPy. Originally that was the motivation for the PyPy developers to create CFFI.
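For anyone who hasn't seen it, CFFI's ABI mode looks roughly like this — a minimal sketch that assumes a Unix libc and the third-party `cffi` package:

```python
from cffi import FFI

ffi = FFI()
# Declare the C signature we want to call (ABI mode: no compiler needed).
ffi.cdef("size_t strlen(const char *s);")
libc = ffi.dlopen(None)  # the already-loaded standard C library

print(libc.strlen(b"pypy"))  # 4
```

On PyPy these calls route through the built-in `_cffi_backend` module, which is what lets the JIT generate efficient interface code for them.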
glkindlmann 24 hours ago [-]
Thank you for the background info, and sorry for explaining CFFI (I just wanted to be sure we were talking about the same thing). Being ignorant about PyPy, I honestly had no idea until now that there was a personnel or purpose overlap between CFFI and PyPy. I am very grateful for CFFI (though I only use it in API mode).
pjmlp 1 days ago [-]
Python was already widely deployed before them, thanks to Zope, and being a saner alternative to Perl.
EdNutting 1 days ago [-]
The Faster Python project would’ve got further if Microsoft hadn’t let the entire team go when they made large numbers of their programming languages teams redundant last year. All in the name of “AI”. Microsoft basically gave up on core computer science to go chase the hype wave.
pansa2 1 days ago [-]
You’re right, of course: even Guido seems to have been moved off working on CPython and onto some tangentially-related AI technology.
However, Faster CPython was supposed to be a 4-year project, delivering a 1.5x speedup each year. AFAIK they had the full 4 years at Microsoft, and only achieved what they had originally planned for the first year.
Qem 1 days ago [-]
To be fair, they suffered a bit from scope creep: mid-project, a second major effort was started to remove the GIL, so the codebase was undergoing two major surgeries at the same time. Hard to believe they could stick to the original schedule under those conditions. Also, GIL removal reduces sequential-execution performance. I imagine some gains from Faster CPython were/will be spent compensating for this hit on GIL-less single-thread performance.
EdNutting 1 days ago [-]
(This affected TypeScript, .NET and other folk too)
pjmlp 1 days ago [-]
See also VC++ now lagging behind ISO, after being the first to achieve C++20.
grzaks 1 days ago [-]
We have been using PyPy on a core system component in production for about 10 years
ajross 1 days ago [-]
> PyPy is a fantastic achievement and deserves far more support than it gets
PyPy is a toy for getting great numbers in benchmarks and demos, is incompatible in a zillion critical ways, and is basically useless for large-scale development for anything that has to interoperate with "real" Python.
Literally everyone who's ever tried it has the experience that you mock up a trial for your performance code, drop your jaw in amazement, and then run your whole app and it fails. Until there's a serious attempt at real 100% compatibility, none of this is going to change.
Also, none of the deltas are well documented. My personal journey with PyPy hit a wall when I realized that its GC is lazy instead of greedy. So a loop that relies on the interpreter to free things up (e.g. file descriptors needing to be closed) rapidly runs into resource exhaustion on PyPy. This is huge, easy to trip over, extremely hard to audit, and... it's like hidden lore or something. No one tells you this, when it needs to be at the top of their front page before you start the port.
networked 1 days ago [-]
"Ask HN: Is anyone using PyPy for real work?" from 2023 contradicts you about PyPy being a toy. The replies are noticeably biased towards batch jobs (data analysis, ETL, CI), where GC and any other issues affecting long-running processes are less likely to bite, but a few replies talk about sped-up servers as well.
Timely management of external resources is what the `with` statement has been for since 2006, when it was added in Python 2.5. To debug these problems Python has ResourceWarning.
Additionally, CPython's GC is also only eager in a best-effort kind of way. If cycles are involved it can take a long time to release memory. This will become even more the case in future versions of CPython, in the free-threaded variants.
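A sketch of the ResourceWarning debugging aid mentioned above. The behaviour shown is CPython's, where refcounting finalizes the leaked file immediately; on PyPy the warning would only appear when its GC eventually runs:

```python
import gc
import tempfile
import warnings

# Create a throwaway file to leak.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"x")
tmp.close()

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", ResourceWarning)  # not shown by default
    f = open(tmp.name, "rb")
    del f         # dropped without close()
    gc.collect()  # CPython finalizes right away; PyPy, maybe much later

print(any(issubclass(w.category, ResourceWarning) for w in caught))
```

Running the test suite with `python -W error::ResourceWarning` is the usual way to turn every such leak into a hard failure.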
ajross 1 days ago [-]
Sorry, the with statement is non-responsive. The question isn't whether you "can" write PyPy-friendly code. Obviously you can.
The question isn't even whether or not you "should" write PyPy-friendly code, it's whether YOU DID, or your predecessors did. And the answer is "No, they didn't". I mean, duh, as it were.
PyPy isn't compatible. In this way and a thousand tiny others. It's not really "Python" in a measurable and important way. And projects that are making new decisions for what to pick as an implementation language for the evolution of their Python code have, let's be blunt, much better options than PyPy anyway.
aftbit 4 hours ago [-]
Strongly disagree. If you're relying on Python garbage collection to free file descriptors in a loop, you have a subtle bug that will rear its head in unexpected and painful ways (and by some unwritten law of software, most notably either at 3 AM or when you have an important demo scheduled). This is true whether you're running on CPython or PyPy. It's not hard to avoid - use `with` or `try...finally`. It's not some newfangled language feature, and it's no surprise - you can't write good RAII code in Python. It's a sign of someone with a poor grasp of the language they're using. If you find things like this, you should fix them, even if you never intend to use PyPy.
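The fix is one line of ceremony either way — a sketch:

```python
import tempfile

def read_safely(path):
    # `with` closes the descriptor when the block exits, on every
    # implementation -- no reliance on eager (refcounting) GC.
    with open(path, "rb") as f:
        return f.read(), f

# Make a small file to read back.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello")
    name = tmp.name

data, f = read_safely(name)
print(data)      # b'hello'
print(f.closed)  # True, deterministically
```

The same pattern with `try...finally` works for resources that don't implement the context-manager protocol.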
cozzyd 1 days ago [-]
I've run into similar resource-exhaustion issues, due to the GC not keeping up, with CPython as well
the_jeremy 2 days ago [-]
If anyone else is also barely aware and confused by the similar names, PyPI is the Python Package Index, which is up and maintained. PyPy is "A fast, compliant alternative implementation of Python." which doesn't have enough devs to release a version for 3.12[0].
Thanks for the clarification. On top of that, this being an issue in the 'uv' GitHub repo (uv installs packages from PyPI) made my brain easily cross the letters.
blahgeek 2 days ago [-]
Reminds me of Cython vs CPython
throwaway27448 1 days ago [-]
What is cpython? I don't think I've heard of this one before.
Edit: it's just python. People are pretending like other attempts to implement this are on equal footing
itishappy 1 days ago [-]
CPython (the compiler) is the most popular implementation of Python (the language) like GCC, Clang, and MSVC (compilers) are implementations of C (the language). Other Python implementations include PyPy, Jython, and IronPython.
Nobody is "pretending" anything. These have all been around for 15+ years at this point. Your ignorance does not imply intent to deceive on others' part.
em-bee 1 days ago [-]
saying "the most popular" hides the actual reason why it is popular though: it is the original python implementation. it defines the standard and functions as a reference for all others. for better or for worse, other implementations have to be bug-compatible with it, and that is what puts them not on equal footing.
for C compilers no reference implementation exists. the C standard was created out of multiple existing implementations.
vector_spaces 1 days ago [-]
PyPy is a JIT-compiled implementation of a language called RPython which is a restricted subset of Python. It does not and has never attempted to implement Python or replace your CPython interpreter for most intents and purposes. CPython is the official reference implementation of the Python language and what you probably use if you write Python code and don't understand the difference between a programming language and its implementations (which is fine)
saila 1 days ago [-]
This doesn't sound right. PyPy has always been described as an alternative implementation of Python that could in some cases be a drop-in replacement for CPython (AKA standard Python) that could speed up production workloads. Underneath that is the RPython toolchain, but that's not what most people are talking about when they talk about PyPy.
paulddraper 1 days ago [-]
The project has self-described as a CPython replacement for many years.
It’s literally the name of the repo [1].
There’s no grounding to feign surprise or concern anymore.
Moreover, I have used PyPy for years to beat the pants off CPython programs.
pypy existed long before type annotations were a thing
masklinn 1 days ago [-]
And JITs often don't care for type specifications as they can generally get better info from the runtime values, need to support that anyway, and for languages like python the type specifications can be complete lies anyway. They also might support (and substitute) optimised versions of types internally (e.g. pypy has supported list specialisation for a long time).
Maybe it's changed since, but last I checked the JVM's JIT did not care at all for java's types.
Which is not to say JITs don't indirectly benefit, mind: type annotations tend to encourage monomorphic code, which JITs like a lot. But unlike most AOT compilers, they don't mind that annotations are polymorphic as long as the runtime behaviour is monomorphic.
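The "complete lies" point is easy to demonstrate: annotations aren't enforced at runtime, so a JIT has to specialize on the values it actually observes. A toy example:

```python
def double(x: int) -> int:
    # The annotation is advisory only: nothing checks it at runtime,
    # so a JIT must specialize on the types it actually observes.
    return x * 2

print(double(3))     # 6
print(double("ab"))  # abab -- the `int` annotation was a lie
```

Static checkers like mypy would flag the second call, but the interpreter itself happily runs it.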
rcxdude 1 days ago [-]
PyPy may not care in principle, but RPython does, being a kind of python dialect designed for static compilation that is intended for writing JIT engines like PyPy.
Muhammad523 2 days ago [-]
Thanks. I knew this already but keep forgetting and getting confused
cjfd 1 days ago [-]
The short summary of it being that these people are beyond terrible at giving names to things.
RobotToaster 1 days ago [-]
Programmers and engineers should never be allowed to name things.
I say that as a programmer and engineer.
chuckadams 1 days ago [-]
"We suck at naming things" -- Bjarne Stroustrup, in a talk about SFINAE
PunchyHamster 1 days ago [-]
On one hand I agree. On the other hand, I look at how marketing people name things and I think we're still better off.
Imagine if next edition of GCC, released in 2026 was named 2027. Then it was GCC One. Then GCC 720. Then GCC XE. Then just plain GCC. Then GCC Teams
mikestew 1 days ago [-]
And then finally…GNU 720 AssistantDriver.
(Tip of the hat to Microsoft’s marketing teams.)
kelvinjps10 1 days ago [-]
The python community has a habit of giving things short names
with 2 days ago [-]
Thanks, I also saw this as PyPI and was confused, lol
chii 2 days ago [-]
now somebody just needs to make a PiPy for the raspberry pi
zugi 2 days ago [-]
Is that PiPyPy or PiPyPI?
f1shy 2 days ago [-]
Please don’t give ideas
aragilar 2 days ago [-]
Somewhat interesting that "volunteer project no longer under active development" got changed to "unmaintained".
maxloh 2 days ago [-]
For context, they have had 2 to 4 commits per month since October [1]. The last release was in July 2025 [2].
That seems reasonably active to me. You can't really expect more from an open source project without paid full-time developers.
killingtime74 2 days ago [-]
What euphemism do you prefer then...
aragilar 2 days ago [-]
There's a difference between dead (i.e. "unmaintained") and low activity ("not under active development"). From what I can see PyPy is in the latter category (and being in that category does not mean it's going to die soon), so choosing to claim it is unmaintained is notable.
Hamuko 2 days ago [-]
Being three major versions behind CPython is definitely not a great sign for the long-term viability of it.
stuaxo 1 days ago [-]
It's always been about that many versions behind.
There is more churn in those versions than you'd think.
aragilar 1 days ago [-]
I'd genuinely be curious what fraction of those changes actually requires porting to other Python implementations. The free-threading changes are inherently interpreter specific, so we can ignore those. A significant change in Python 3.12 is dropping "dead batteries", so that can be ignored as well. From what I can see, the main language changes are typing-based (so could have parser implications), and the subinterpreter support being exposed at the Python level (I don't know whether that makes sense for PyPy either). I think this hints that while certain area of Python are undergoing larger changes (e.g. typing, free-threading), there is no obvious missing piece that might drive someone to contribute to PyPy.
Also, looking at the alternate (full) interpreters that have been around a while, PyPy is much more active than either Jython or IronPython. RustPython seems more active than PyPy, but it's not clear how complete it is (and it has gone through similar periods of low activity).
Would I personally use PyPy? I'm not planning to, but given how uv is positioning itself, this gives me vibes of YouTube stating it would drop IE6 at some unspecified time in order to kill IE6 (see https://benjamintseng.com/2024/02/the-ie6-youtube-conspiracy...).
mattip 1 days ago [-]
The problem is the million small paper cuts. The stdlib changes are not all in pure Python; many have implications for compiled modules like _ssl. The interpreter changes also require work to figure out, especially compatibility with small interpreter changes that are reflected in the dis module.
saghm 2 days ago [-]
I'm not sure "major versions" is the most correct term here, but I think your point is spot on
OJFord 1 days ago [-]
They are de facto semantic major versions - think of recent-ish additions like f-strings and match-case (3.6 and 3.10), where you'd get a syntax error in an older parser. PyPy targeting 3.9, for example, would support f-strings but not match-case.
Or at runtime: you can import things from the standard library which require a minimum 3.x - .x releases frequently, if not always, add things, or even change an existing API.
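Syntax additions (f-strings, match-case) can't be guarded at runtime because they fail at parse time; stdlib additions can be, with a version-gated fallback. A sketch:

```python
import sys

# functools.cache was added in 3.9; fall back to lru_cache before that.
if sys.version_info >= (3, 9):
    from functools import cache
else:
    from functools import lru_cache
    cache = lru_cache(maxsize=None)  # behaviourally equivalent fallback

@cache
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```

This is exactly the kind of shim libraries stop writing once a Python version falls out of their support window — at which point an interpreter stuck on that version can no longer run them.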
saghm 5 hours ago [-]
> They are de facto semantic major versions - think of recent-ish additions like f-strings and match-case (3.6 and 3.10), where you'd get a syntax error in an older parser. PyPy targeting 3.9, for example, would support f-strings but not match-case.
Are you saying that you'd get an error using the new feature on an old version, or that code that used to parse on old versions would no longer work on the newer version? The former is pretty much a textbook example of a minor version update in "traditional" semver; a single new API function is enough to potentially make new code not work on old versions, since any calls to that function will not work on versions predating it. The latter is what would constitute a "breaking change" in semver; old code that used to work can't be made to no longer work without a major version bump.
I say "traditional" semver because in practice it seems like there are fairly few cases in which people actually fully agree on what semver means. I've always found the "official" definition[1] to be extremely clear, but from what I can tell most people don't really adhere to it and have their own personal ideas about what it means. I've proposed things in projects that are fully within both the letter and spirit of semver quite a few times over the years, only for people to object on the basis that it "isn't semver" because they hadn't fully read through the description before. Typically I'll mention that what I'm suggesting is something semver allows, bring up the page and show them the clause that specifically allows it (while clarifying that we still might not want to do it for other reasons), and the other person will end up preferring to stick with their initial instinct independent of what semver actually says. This is totally fine, as semver is just one of many versioning schemes and not some universal optimum, but my point is that it's probably more confusing for people to use the same term to describe fairly inconsistent things.
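In strict semver terms, the compatibility rule described above reduces to something like this (a toy check, not Python's actual policy):

```python
def semver_satisfies(required, installed):
    # Strict semver: same major version, and the installed minor
    # version must be at least the required one (additive changes only).
    return installed[0] == required[0] and installed[1] >= required[1]

print(semver_satisfies((3, 9), (3, 11)))  # True: only additions since 3.9
print(semver_satisfies((3, 11), (3, 9)))  # False: 3.11 features missing
print(semver_satisfies((3, 9), (4, 0)))   # False: major bump may break
```

The thread's disagreement is precisely that CPython's minor releases sometimes remove things (e.g. dead batteries), which this rule would reserve for a major bump.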
True - I don't think I really had my head screwed on there. It just 'feels different' because it's language level, the actual syntax, I suppose, but no - you're right.
Hamuko 2 days ago [-]
For Python, 0.1 increases are major versions and 1.0 increases are cataclysmic shifts.
johndough 1 days ago [-]
I don't know about that. For me, f-strings were the last great quality-of-life improvement that I wouldn't want to live without, and those landed in Python 3.6. Everything after that has not really made much of a difference to me.
dxdm 1 days ago [-]
This reads like you think that "major" version bumps should only happen when things make a big difference to you personally. At least, that's where you land when you follow the logic of your statement. I think you may overrate the importance of your particular use case, and misunderstand what GP meant by "major".
The gist of what GP meant is that Python does not exactly follow SemVer in their numbering scheme, and they treat the middle number more like what would warrant a major (left-most) number increase in SemVer. For example, things will get deprecated and dropped from the standard library, which is a backwards-incompatible change. Middle number changes is also when new features are released, and they get their own "what's new" pages. So on the whole, these middle-number changes feel like "major" releases.
That being said, the Python docs themselves [0] call the left-most number the "major" one, so GP is not technically correct, while I'd say they're right for practical, but easier to misunderstand, purposes.
> A is the major version number – it is only incremented for really major changes in the language.
> B is the minor version number – it is incremented for less earth-shattering changes.
> C is the micro version number – it is incremented for each bugfix release.
> That being said, the Python docs themselves [0] call the left-most number the "major" one, so GP is not technically correct, while I'd say they're right for practical, but easier to misunderstand, purposes.
That's ultimately the point I was trying to make; my inner pedant can't help but feel the need to push back on people using versioning terminology inconsistently, but in practice I don't think it really made much of a difference in this case.
johndough 1 days ago [-]
Oh, you are right, I forgot that "major version" is a technical term and incorrectly read it as "For Python, 0.1 increases make a big difference". My bad!
localuser13 1 days ago [-]
If you want your code to run, you need a Python interpreter new enough for all of your dependencies. You may not use features that came after 3.6 (though you obviously do), but if even one dependency or sub-dependency uses a Python-3.10-specific feature, you now need an interpreter at least that new.
johndough 1 days ago [-]
That is true, and it is also a huge pet peeve of mine. If more library maintainers showed some restraint in using the newest and hottest features, we'd have much less update churn. But on the other hand, this is what keeps half of us employed, so maybe we should keep at it after all.
toyg 1 days ago [-]
That's like saying the last tax that affected you was passed in 2006...
johndough 1 days ago [-]
I don't understand. Could you elaborate?
fn-mote 1 days ago [-]
It means there are lots of changes in each “minor” version that the poster is ignoring because they are not personally affected.
Match case and even the walrus operator come to mind.
kev009 2 days ago [-]
Undermaintained might be more apt, since it does have life but doesn't appear commercially healthy or particularly relevant to other communities.
dapperdrake 2 days ago [-]
Underphrased like a pro.
electroglyph 2 days ago [-]
much respect to the PyPy contributors, but it seems like a pretty fair assessment
swiftcoder 2 days ago [-]
9 months since the last major release definitely feels like a short time in which to declare time-of-death on an open source project
tempay 2 days ago [-]
It’s been a lot longer than that. There was a reasonable sized effort to provide binaries via conda-forge but the users never came. That said, the PyPy devs were always a pleasure to work with.
We're in March 2026. That's 9 months, which is exactly what GP stated.
> There was a reasonable sized effort to provide binaries via conda-forge but the users never came.
How is that in any way relevant to the maintenance status of pypy?
hobofan 1 days ago [-]
It is also lagging behind in terms of Python releases. They are currently on 3.11, which was released 3.5 years ago for mainline Python.
masklinn 1 days ago [-]
> It is also lagging behind in terms of Python releases.
Which it has always been, especially since Python 3, as anyone who's followed the pypy project over the last decade is well aware.
crote 1 days ago [-]
The problem is that it is lagging behind enough that it is falling out of the support window for a lot of libraries.
Imagine someone releases RustPy tomorrow, which supports Python 2.7. Is it maintained? Technically, yes - it is just lagging behind a few releases. Should tooling give a big fat warning about it being essentially unusable if you try to use it with the 2026 Python ecosystem? Also yes.
swiftcoder 4 hours ago [-]
3.11 still has 2 years of active security patches, and has most of the modern python ecosystem on tap. That is a whole different ballgame than stuff stuck in the pre-split 2.x world
masklinn 1 days ago [-]
> The problem is that it is lagging behind enough that it is falling out of the support window for a lot of libraries.
Which is a concern for those libraries, I've not seen one thread criticising (or even discussing) numpy's decision.
> Should tooling give a big fat warning about it being essentially unusable if you try to use it with the 2026 Python ecosystem? Also yes.
But it's not, and either way that has nothing to do with uv, it has to do with people who use pypy and the libraries they want to use.
LtWorf 1 days ago [-]
But if you set up dependabot and automerge some crap every couple of days your project will be very active!
Meanwhile my projects got marked as abandoned because those scanners are unaware of codeberg being a thing.
didip 2 days ago [-]
wow, that would be a big shame. I hope many of the useful learnings are already ported to CPython.
mattip 1 days ago [-]
- The pure python repl started off in PyPy, although a lot of work was done to make it ready for prime time by the CPython core devs
- The lessons from HPy are slowly making their way into CPython, see https://github.com/py-ni
- There were many fruitful interactions in fixing subtle bugs in CPython that stemmed from testing the stdlib on an alternative implementation
And more
mkl 2 days ago [-]
Almost none of it will have been ported to CPython, as it's a completely different approach.
skissane 2 days ago [-]
I really like PyPy’s approach of using a Python dialect (RPython) as the implementation language, instead of C. From a conceptual perspective, it is much more elegant. And there are other C-like Python dialects now too - Cython, mypy’s mypyc. It would be a shame if PyPy dies.
hrmtst93837 1 days ago [-]
Most pure Python libraries run on PyPy without porting; the incompatibilities come from C extensions written against the CPython C-API (numpy, lxml, and many crypto libraries) that either fail or run poorly under PyPy's cpyext compatibility layer.
If you plan to support PyPy, add it to your CI, prefer cffi or pure Python fallbacks over CPython C-API extensions, and be ready to rewrite or vendor performance-critical C extensions because cpyext is slow and incomplete and will waste your debugging time.
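As a sketch of the cffi recommendation (the guard and function choice are illustrative, not from the comment): in ABI mode, cffi declares the C signature in Python and loads the library at runtime, with no CPython C-API involved, so the same code runs under PyPy without going through cpyext:

```python
# Guarded because cffi is a third-party package and may be absent.
try:
    from cffi import FFI
except ImportError:
    FFI = None

if FFI is not None:
    ffi = FFI()
    ffi.cdef("int abs(int x);")  # declaration copied from <stdlib.h>
    C = ffi.dlopen(None)         # None opens the C standard library namespace (Unix)
    result = C.abs(-42)          # 42
```

The same declaration can also be compiled ahead of time in cffi's API mode; either way the extension stays off the CPython C-API and PyPy can JIT through the call.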
scosman 2 days ago [-]
Read as PyPi and almost had heart attack
xvilka 1 days ago [-]
At this point it's probably better investing time and money into RustPython[1][2].
Why would anyone use a python interpreter that is slower than CPython?
1970-01-01 1 days ago [-]
Money is a forcing function for development. Why is there still no way to donate to all devs in the dependency tree? Should we just anticipate expensive problems just like this when the rot finally makes it uncomfortable to continue development?
moktonar 2 days ago [-]
Thank you for all the work guys, I’ll see how I can help.
Imustaskforhelp 2 days ago [-]
@kvinogradov (Open source endowment), I am (Pinging?) you because I think you may be able to help: I remember you describing the Open Source Endowment and its approach to how (and which) open source projects get funded[0]
And I think that PyPy might be of interest to the Fund for sponsoring, given it's close to unmaintained. PyPy is really great in general at speeding up Python[1] by an order of magnitude.
Maybe the fund could help pay the underfunded maintainers, whose underfunding led to the project being unmaintained in the first place. Pinging you because I am interested to hear your response and, hopefully, to see PyPy get a better funding model for its underfunded maintainers.
> @kvinogradov (Open source endowment), I am (Pinging?) you
unfortunately, @-pinging does not work on this site, it does nothing to notify anyone. If you want to get a specific person’s attention, use off-site communication mechanisms
latexr 1 days ago [-]
> unfortunately, @-pinging does not work on this site
I’d call it fortunate, and a feature. Not pinging certainly avoids many discussions becoming too heated too fast between two people and lets other opinions intervene.
pbhjpbhj 1 days ago [-]
There are systems in place to prevent fast back-and-forth arguments.
Not having a mentions functionality for those who wish to use it doesn't seem to change anything around over-heated discussions.
I'd make @ a page like 'threads' which just includes any comments with @$username.
latexr 1 days ago [-]
> There are systems in place to prevent fast back-and-forth arguments.
Like what? I never saw anything to suggest that is the case.
> Not having a mentions functionality for those who wish to use it doesn't seem to to change anything around over-heated discussions.
Of course it does. If you have to keep checking manually, eventually you’ll get distracted. By the time you come back, if you do, there may already be another reply to the reply and you may no longer feel the need to comment. Nor will you be inclined to respond to a comment made days later in a nested discussion, because you won’t find it. But people just arriving at the thread might, and continue the discussion with new perspectives.
> I'd make @ a page like 'threads' which just includes any comments with @$username.
To each their own, I’m thankful HN doesn’t have that feature.
esafak 1 days ago [-]
> Like what? I never saw anything to suggest that is the case.
You get blocked if you comment too fast.
latexr 1 days ago [-]
That wouldn’t prevent back and forth arguments, the block isn’t fast enough.
pinkmuffinere 2 days ago [-]
HN doesn’t have this sort of pinging behavior :/
doctorpangloss 2 days ago [-]
Knowing PyPy has good implementations of a lot of behavior helped me fix multiprocessing in Maya's Python interpreter, which fixed stuff like torch running inside of Maya.
it's too bad. it is a great project for a million little use cases.
semiinfinitely 1 days ago [-]
my view/experience is that pypy only makes faster the type of python code which you absolutely should not write in python if you care about performance
DemocracyFTW2 1 days ago [-]
> This thread is about PyPy, not PyPI.
The hardest things in programming. That and designing a logo for something you cannot touch, smell or see.
anonnon 2 days ago [-]
Odd how you still see announcements of this nature if Anthropic's marketing is to be believed.
jorvi 2 days ago [-]
Yup.
For me the biggest signifier is Spotify. They claim their (best) devs don't even code anymore, they use an internal AI tool that they just send prompts to which then checks out a personal test build that they can download off of Slack. "A new feature in 10 minutes!"
Okay, if that is the case, why have we only seen like 3-4 minor new QoL improvements in Spotify the last ~12 months, with no new grand features? And why haven't they fired 95% of their devs and let the remaining elite go buckwild with Claude?
The Emperor really has no clothes.
stavros 1 days ago [-]
Everyone here says "if developers are so much faster, why aren't we seeing more features?!" as if the only thing required to release a feature is developers.
My CEO keeps asking me "how can we go faster with AI", and my answer is "we can't, because even if we had developers that would instantly develop any feature perfectly, we'd still be bottlenecked on how slow we are at deciding what to actually release".
crote 1 days ago [-]
> why have we only seen like 3-4 minor new QoL improvements
You are seeing improvements? From what I can tell, my user experience has only been going downhill over the past years - even pre-AI...
SCdF 1 days ago [-]
tbf they have been saying they've started doing this since December, so we're only a few months in. And like most software it's an iceberg: 99% of the work is not observable by users, and in Spotify's case listeners are only one of presumably dozens of different user groups. For all we know they are shipping massive improvements to eg billing.
brodo 1 days ago [-]
Also, why isn't there a native client for all platforms? Could they not just let the AI auto-translate the code?
wiseowise 1 days ago [-]
Because believe it or not, majority of users couldn't care less whether it is native or not. I don’t even see Spotify, it’s just something that lives in the background and plays music.
re-thc 1 days ago [-]
> They claim their (best) devs don't even code anymore
No, they claimed they didn’t code during a time period. Around year end until early this year. Technically they could have just been on leave.
Also best dev = principal / staff engineers. They rarely code anyway.
AI or no AI anyone could have made that claim.
QQ00 2 days ago [-]
Anthropic released vibe coded C compiler that doesn't work, how their LLM can help in maintaining PyPy?
networked 1 days ago [-]
Strange subthread. I don't see Claude Opus 4.6 changing the tide for PyPy. There is no need to understate AI capabilities for this.
"Anthropic released vibe coded C compiler that doesn't work" sounds like https://github.com/anthropics/claudes-c-compiler/issues/1 passed through a game of telephone. The compiler has some wrong defaults that prevent it from straightforwardly building a "Hello, world!" like GCC and Clang. The compiler works:
> The 100,000-line compiler can build a bootable Linux 6.9 on x86, ARM, and RISC-V. It can also compile QEMU, FFmpeg, SQlite, postgres, redis, and has a 99% pass rate on most compiler test suites including the GCC torture test suite. It also passes the developer's ultimate litmus test: it can compile and run Doom.
The primary objective is to retarget PyPy on top of the Python main branch. A minor objective is to document which parts of PyPy can be ported to CPython (or RustPython).
- Keep a markdown log of issues in order to cluster and close them when fixed.
- Clone PyPy and CPython.
- Review the PyPy codebase and docs.
- Prepare a devcontainer.json for PyPy to more safely contain coding LLMs and simplify development.
- Review the backlog of PyPy issues.
- Review the CPython whatsnew docs for each version of Python (since and including 3.11). What has changed in CPython since 3.11 which affects PyPy?
- Study the differences between PyPy code and CPython code to understand how to optimize like PyPy.
- Prepare an AGENTS.md for PyPy.
- Prepare an agent skill for upgrading PyPy with these and other methods.
- Write tests to verify that everything in PyPy works after updating it to be compatible with the Python main branch (or the latest stable release, CPython 3.14).
tjpnz 1 days ago [-]
Strikes me as the worst possible solution if they're struggling to find maintainers in the first place. Who reviews the vibe coded patches?
riedel 1 days ago [-]
> Anthropic released vibe coded C compiler that doesn't work, how their LLM can help in maintaining PyPy?
This is the perfect question to highlight the major players. In my opinion, a rapidly developing language with a clear reference implementation, readily accessible specifications, and a vast number of easily runnable tests would make an ideal benchmark.
Hamuko 2 days ago [-]
Most maintainers don't have a stack of cash to throw at tokens.
croddin 2 days ago [-]
They don’t need to throw a stack of cash at them, Anthropic and OpenAI have programs for open source maintainers.
I'd say they're less "programs" than "six-month trials". What's the plan after six months?
And for what's it worth, PyPy isn't even eligible for the Claude trial because they have a meager 1700 stars on GitHub.
ratijas 1 days ago [-]
If number of stars may help projects adopt AI, that makes me reconsider starring projects at all.
cozzyd 24 hours ago [-]
Better to pick projects not hosted on GitHub at all!
blitzar 2 days ago [-]
> What's the plan after six months?
An unmaintainable mass of AI slop code and the decision to either pay the AI tax or abandon the project.
justinclift 1 days ago [-]
Isn't the Claude one only for a few months?
(I haven't checked the OpenAI one, as I have no interest in them)
simonw 1 days ago [-]
Both programs have been announced as granting six months, but neither of them have explicitly said that there won't be options to renew for another six months.
I expect they haven't decided that themselves yet and don't want to commit publicly until they've seen how well the program goes.
latexr 1 days ago [-]
Even if you’re right, no one should be making a decision of enrolling into those programs because maybe, with zero indication they’ll be renewed again in six months.
You know what they could also do? Stop the programs for new enrolments next month. Or if they renew them like you said, it could be with new conditions which exclude people currently on them.
There are too many unknowns, and giving these companies the benefit of the doubt that they’ll give more instead of taking more goes counter to everything they showed so far.
simonw 1 days ago [-]
Is your argument here that you shouldn't accept the free trial because you might find it useful and then be trapped into paying for more of it later?
latexr 1 days ago [-]
No, my argument is that your “but neither of them have explicitly said that there won't be options to renew for another six months” point is not something anyone should realistically be counting on, and is not a valid counter argument to your parent post of “Isn't the Claude one only for a few months?”.
We should be discussing what is factual now, not be making up scenarios which could maybe happen but have zero indication that they will.
simonw 1 days ago [-]
I didn't say that I thought they would likely extend it, but I stand by my statement that it's a possibility.
Neither company have expressed that the six month thing is a hard limit.
The fact that OpenAI shipped their version within two weeks of Anthropic's announcement suggests to me that they're competing with each other for credibility with the open source community.
(Obviously if you make decisions based on the assumption that the program will be expanded later you're not acting rationally.)
localuser13 1 days ago [-]
If I understand correctly, they are literally giving things away for free for a 6 months period and we are complaining that they don't promise it stays free forever?
latexr 1 days ago [-]
No, you did not understand correctly. They are not “literally giving things away for free”, they are providing a very conditional free trial, which is a business decision and not anything new. Then a commenter speculated they might extend that program because they didn’t say they won’t and I pointed out it doesn’t make sense to assume they will. No one on this immediate thread made any complaint, we’re discussing the facts of the offering.
dapperdrake 2 days ago [-]
"You're completely right. That mushroom is poisonous."
indubioprorubik 1 days ago [-]
Is this another subversion attack? Basically putting up a subverted alternative to an established but lazily maintained package, then creating enough ruckus for the target's users to switch packages?
markkitti 1 days ago [-]
Is Python dying? /s
shevy-java 2 days ago [-]
What annoys me is the name. Early morning it took me a moment to realise that PyPy is not PyPi, so at first I thought they referred to PyPi. Really, just for the name confusion alone, one of those two should have to go.
Edit: I understand the underlying issue and the PyPy developer's opinion. I don't disagree on that part; I only refer to the name similarity as a problem.
stavros 1 days ago [-]
There is no PyPi, it's PyPI (py pee eye), the Python package index.
dxdm 1 days ago [-]
If you have to insist that a name needs a certain capitalization to properly exist, you're in the territory of brand zealotry and pedantry. The people who don't care for one reason or other vastly outnumber you, and they will invent your disfavored capitalization into existence. The same goes for pronunciation. GIF? Jira?
If your thing can be reached under "pypi.org", you can either accept that people will come up with their own ideas of how to capitalize or pronounce the name, or you can fight against windmills and tell people what ought to exist or not.
puzzledobserver 1 days ago [-]
Wikipedia tells me that the package index PyPI (launched in 2003) is about 4 years older than the interpreter PyPy (first released in 2007).
Still, at its core, PyPy is a Python interpreter which is itself written in Python and the name PyPy fittingly describes its technical design.
scbrg 1 days ago [-]
No. PyPy development was ongoing long before the first release. The first intact commit in the PyPy repo is from February 2003: https://github.com/pypy/pypy/commit/6434e25b53aa307288e5cd8c....
And that commit indicates there's been development going on for a while already. The commit message is:
"Move the pypy trunk into its own top level directory so the path names stay constant."
PyPy migrated from Subversion to git at some point. Not sure how much of the history survived the migration.
aragilar 1 days ago [-]
I think back then PyPI was known as the cheeseshop, so there wouldn't have been the same confusion.
> I wanted to put a little £ towards the project but couldn't see a place to do it.
https://pypy.org/howtohelp.html
https://opencollective.com/pypy
I use PyPy regularly on an app of mine, and very often when I need to do some compute heavy load. Typically over 5x faster than CPython. It makes some stuff that takes impossibly long with CPython (nobody wants to wait 5 minutes...), to returning a response in a few seconds.
An easy chart to show v3.x is 10% faster than the last version would be great.
I feel like you should either put absolute numbers side by side or show how much faster PyPy is (instead of how much time it takes).
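One simple way to get an absolute side-by-side number (a sketch, not an official PyPy benchmark) is to run the identical script under both interpreters and compare the printed wall-clock times:

```python
# Run once as "python3 bench.py" and again as "pypy3 bench.py".
import time

def fib(n: int) -> int:
    # CPU-bound pure-Python recursion: the kind of code PyPy's JIT helps most.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

start = time.perf_counter()
fib(25)
print(f"fib(25): {time.perf_counter() - start:.3f}s")
```

Because the workload is identical, the ratio of the two printed times is the speedup, with no need to normalize anything.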
The 150th rewrite of unicodeobject.c is relatively benign (except that it probably costs Red Hat money) but the other things are impossible to keep up with.
> not actively developed anymore
There may be non-zero maintenance work happening, but a project that only maintains support for old versions and will never adopt new ones is functionally one that the ecosystem will eventually forget about. Maybe you call that "under active development" but my response is "ok, then I don't care whether it's under active development, I (and 99.9% of other people) should care about whether it's going to support new minor versions."
On the other hand, if you don't support new minor versions day one, but you eventually support them, that's quite different.
Considering that PyPy is only just now starting to seriously work on supporting 3.12, there's a pretty high chance that it won't even be ready for use before becoming obsolete. At that point it doesn't even matter whether you want to call it "in active development", it is simply too far behind to be relevant.
[0]: https://scientific-python.org/specs/spec-0000/
[1]: https://www.python.org/downloads/release/python-3120/
If you can choose your own versions and care at all about new releases, you can track latest and greatest with at the very most a few months of lag. Six months of "support" is luxurious in this scenario.
If you can't choose your own versions, you are most likely stuck on some sort of LTS Linux and will need to make do with what they provide. In that case three years is a cruel joke, because almost everything will be more than three years old when it is first deployed in your environment.
When you have some kind of ecosystem rule for that, you can make these upgrade decisions with a lot more confidence.
For example in my project I have a dependency on zstandard. In 3.14 zstandard was added to the standard library. With this ecosystem wide 3 year support cycle I can in good confidence drop the dependency in three years and use the standard lib from then on.
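That handover can be staged today with a guarded import. A sketch (assuming `compression.zstd`, the stdlib module added in Python 3.14, and `zstandard`, the third-party package):

```python
# Prefer the stdlib module on 3.14+, fall back to the zstandard package,
# and tolerate neither being installed.
try:
    from compression.zstd import compress, decompress   # Python 3.14+
except ImportError:
    try:
        from zstandard import compress, decompress      # third-party package
    except ImportError:
        compress = decompress = None                    # no zstd support available

if compress is not None:
    payload = b"hello" * 100
    assert decompress(compress(payload)) == payload
```

Once the three-year window passes, the fallback branch (and the dependency) can be deleted wholesale.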
I feel like it just prevents the ecosystem from going stale because some important core library is still supporting a really old version, thus preventing other smaller libraries from using new language features as well, to not exclude a large user base still on an old version.
The killer feature is the ecosystem: easily and reliably reusing other libraries and tools that work out-of-the-box with other Python code written in the last few years. There are individually neato features motivating the efforts involved in upgrading a widely-used language & engine as well, but that kind of thinking misses the forest for the trees unfortunately.
It's a bit surprising to me, in the age of AI coding, for this to be a problem. Most features seem friendly to bootstrapping with automation (ex: f-strings that support ' not just "), and it's interesting if any don't fall in that camp. The main discussion seems to still be framed by the 2024 comments, before Claude Code etc became widespread: https://github.com/orgs/pypy/discussions/5145 .
Sure you can, where you should have pinned dependencies, but that's a lot of overhead for a random script...
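For what it's worth, PEP 723 inline script metadata (which uv supports) targets exactly the random-script case: the pins live in a comment header in the file itself. A sketch (the `httpx` dependency is a placeholder):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = ["httpx"]
# ///
# Running "uv run script.py" reads the header above, builds a matching
# environment, and executes the file; plain "python script.py" ignores it.
import sys

assert sys.version_info >= (3, 9)
```

No project, lockfile, or virtualenv management is needed by hand; the tool derives all of it from the header.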
There is literally a Python 3.12 milestone in the bug tracker.
> my response is "ok, then I don't care whether it's under active development, I (and 99.9% of other people) should care about whether it's going to support new minor versions."
It sounds a lot more like your actual response is "I don't care about pypy".
Which is fine, most people don't to start with. You don't have to pretend just to concern-troll the project.
On the other hand, I always got the impression that the main goal of PyPy is to be a research project (on meta-tracing, STM etc) rather than a replacement for CPython in production.
Maybe that, plus the core Python team’s indifference towards non-CPython implementations, is why it doesn’t get the recognition it deserves.
PyPy’s alternative, CFFI, was not attractive enough for the big players to adopt. And HPy, another alternative that would have played better with Cython and friends, came too late in the game; by that time PyPy development had lost momentum.
However, Faster CPython was supposed to be a 4-year project, delivering a 1.5x speedup each year. AFAIK they had the full 4 years at Microsoft, and only achieved what they originally planned to do in 1 year.
PyPy is a toy for getting great numbers in benchmarks and demos, is incompatible in a zillion critical ways, and is basically useless for large-scale development for anything that has to interoperate with "real" Python.
Literally everyone who's ever tried it has the experience that you mock up a trial for your performance code, drop your jaw in amazement, and then run your whole app and it fails. Until there's a serious attempt at real 100% compatibility, none of this is going to change.
Also none of the deltas are well-documented. My personal journey with PyPy hit a wall when I realized that its GC is lazy instead of greedy. So a loop that relies on the interpreter to free stuff up (e.g. file descriptors needing to be closed) rapidly runs into resource exhaustion in PyPy. This is huge, easy to trip over, extremely hard to audit, and... it's like it's hidden lore or something. No one tells you this, when it needs to be at the top of their front page before you start the port.
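A minimal illustration of that trap (hypothetical helper names): on CPython, refcounting closes each file the moment the loop drops it, while under PyPy's lazy GC the first pattern can exhaust file descriptors long before a collection runs:

```python
def risky_total(paths):
    total = 0
    for p in paths:
        # Handle is dropped but never closed explicitly; PyPy defers the
        # close until the GC eventually collects the file object.
        total += len(open(p, "rb").read())
    return total

def safe_total(paths):
    total = 0
    for p in paths:
        with open(p, "rb") as f:   # closed deterministically on any interpreter
            total += len(f.read())
    return total
```

Both functions compute the same result; only the second releases descriptors promptly regardless of GC strategy.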
https://news.ycombinator.com/item?id=36940871 (573 points, 181 comments)
Additionally, CPython's GC is also only eager in a best-effort kind of way. If cycles are involved it can take a long time to release memory. This will become even more the case in future versions of CPython, in the free-threading variants.
The question isn't even whether or not you "should" write PyPy-friendly code, it's whether YOU DID, or your predecessors did. And the answer is "No, they didn't". I mean, duh, as it were.
PyPy isn't compatible. In this way and a thousand tiny others. It's not really "Python" in a measurable and important way. And projects that are making new decisions for what to pick as an implementation language for the evolution of their Python code have, let's be blunt, much better options than PyPy anyway.
[0]: https://github.com/orgs/pypy/discussions/5145
Edit: it's just python. People are pretending like other attempts to implement this are on equal footing
Nobody is "pretending" anything. These have all been around for 15+ years at this point. Your ignorance does not imply intent to deceive on others part.
For C compilers, no reference implementation exists; the C standard was created out of multiple existing implementations.
It’s literally the name of the repo [1].
There’s no grounding to feign surprise or concern anymore.
Moreover, I have used PyPy for years to beat the pants off CPython programs.
[1] https://github.com/python/cpython
Given that both pypy (through RPython) and mypy deal with static type checks in some sense, I kept confusing the two projects until recently.
Also, I just learnt (from another comment in this post) about mypyc [1], which seems to complete the circle somehow in my mind.
https://mypy.readthedocs.io/en/stable/mypy_daemon.html
Maybe it's changed since, but last I checked the JVM's JIT did not care at all for java's types.
Which is not to say JITs don't indirectly benefit mind, type annotations tend to encourage monomorphic code, which JITs do like a lot. But unlike most AOT compilers it's not like they mind that annotations are polymorphic as long as the runtime is monomorphic...
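A sketch of that distinction in Python terms (illustrative names): the annotation is polymorphic (any `Sequence`), but if a given call site only ever passes lists, that site is monomorphic at runtime and a tracing JIT can specialize the loop for list-of-int:

```python
from typing import Sequence

def total(xs: Sequence[int]) -> int:
    # The static type admits any Sequence; what the JIT cares about is the
    # concrete runtime type it keeps observing at each call site.
    s = 0
    for x in xs:
        s += x
    return s

total([1, 2, 3])   # this call site only ever sees list[int]: monomorphic
```

Passing a `tuple` or a `range` here would still be correct Python, but mixing types at one hot call site is what costs a JIT its specialization, not the breadth of the annotation.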
I say that as a programmer and engineer.
Imagine if the next edition of GCC, released in 2026, was named 2027. Then it was GCC One. Then GCC 720. Then GCC XE. Then just plain GCC. Then GCC Teams.
(Tip of the hat to Microsoft’s marketing teams.)
[1]: https://github.com/pypy/pypy/commits/main/
[2]: https://github.com/pypy/pypy/tags
There is more churn in those versions than you'd think.
Also, looking at the alternate (full) interpreters that have been around a while, PyPy is much more active than either Jython or IronPython. RustPython seems more active than PyPy, but it's not clear how complete it is (and it has gone through similar periods of low activity).
Would I personally use PyPy? I'm not planning to, but given how uv is positioning itself, this gives me vibes of youtube stating it will drop IE 6 at some unspecified time in order to kill IE 6 (see https://benjamintseng.com/2024/02/the-ie6-youtube-conspiracy...).
Or at runtime, you can import things from the standard library which require a minimum 3.x. - .x releases frequently if not always add things, or even change an existing API.
Are you saying that you'd get an error using the new feature on an old version, or that code that used to parse on old versions would no longer work on the newer version? The former is pretty much a textbook example of a minor version update in "traditional" semver; a single new API function is enough to potentially make new code not work on old versions, since any calls to that function will not work on versions predating it. The latter is what would constitute a "breaking change" in semver; old code that used to work can't be made to no longer work without a major version bump.
I say "traditional" semver because in practice it seems like there are fairly few cases in which people actually seem to fully agree on what semver means. I've always found the "official" definition[1] to be extremely clear, but from what I can tell most people don't really adhere to it and have their own personal ideas about what it means. I've proposed things in projects that are fully within both the letter and spirit of semver quite a few times over the years only for people to object on the basis that it "isn't semver" because they hadn't fully read through the description before. Typically I'll mention that what I'm suggesting is something that semver allows and bring up the page and show them the clause that specifically allows what I'm saying but clarify that I recognize we still might not want to do it for other reasons, and the other person will end up preferring to stick with their initial instinct independent of what semver actually says. This is totally fine, as semver is just one of many versioning scheme and not some universal optimum, but my point is that it's probably more confusing for people to use the same term to describe fairly inconsistent things.
[1]: https://semver.org/
The gist of what GP meant is that Python does not exactly follow SemVer in its numbering scheme, and treats the middle number more like what would warrant a major (left-most) number increase in SemVer. For example, things get deprecated and dropped from the standard library, which is a backwards-incompatible change. Middle-number changes are also when new features are released, and they get their own "what's new" pages. So on the whole, these middle-number changes feel like "major" releases.
That being said, the Python docs themselves [0] call the left-most number the "major" one, so GP is not technically correct, while I'd say they're right for practical, but easier to misunderstand, purposes.
> A is the major version number – it is only incremented for really major changes in the language.
> B is the minor version number – it is incremented for less earth-shattering changes.
> C is the micro version number – it is incremented for each bugfix release.
The docs do not seem to mention you, though. :P
[0]: https://docs.python.org/3/faq/general.html#how-does-the-pyth...
pypy 7.3.20, officially supporting python 3.11, was released in july 2025: https://pypy.org/posts/2025/07/pypy-v7320-release.html
[1] https://rustpython.github.io/
[2] https://github.com/RustPython/RustPython
And I think that PyPy might be of interest to the Fund for sponsoring, given it's close to unmaintained. PyPy is really great in general, speeding up Python[1] by orders of magnitude.
Maybe the fund could help pay the maintainers, who are underfunded, which led to the project being unmaintained in the first place. Pinging you because I am interested to hear your response and, hopefully, to see PyPy get a better funding model for its underfunded maintainers.
[0]: https://endowment.dev/about/#model
[1]: https://benjdd.com/languages2/ (refer to the PyPy vs. Python difference being ~15x)
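For anyone who wants to reproduce that kind of number locally, a rough sketch: time a pure-Python hot loop under both interpreters. The loop size is arbitrary and the ~15x figure is workload-dependent; JIT-friendly numeric loops like this are close to PyPy's best case.

```python
# Run this file under both `python` and `pypy` and compare the timings.
import time

def busy_loop(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()
result = busy_loop(10_000_000)
elapsed = time.perf_counter() - start
print(result)
print(f"{elapsed:.3f}s")
```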
unfortunately, @-pinging does not work on this site, it does nothing to notify anyone. If you want to get a specific person’s attention, use off-site communication mechanisms
I’d call it fortunate, and a feature. Not pinging certainly avoids many discussions becoming too heated too fast between two people and lets other opinions intervene.
Not having a mentions functionality for those who wish to use it doesn't seem to change anything around over-heated discussions.
I'd make @ a page like 'threads' which just includes any comments with @$username.
Like what? I never saw anything to suggest that is the case.
> Not having a mentions functionality for those who wish to use it doesn't seem to to change anything around over-heated discussions.
Of course it does. If you have to keep checking manually, eventually you’ll get distracted. By the time you come back, if you do, there may already be another reply to the reply and you may no longer feel the need to comment. Nor will you be inclined to respond to a comment made days later in a nested discussion, because you won’t find it. But people just arriving at the thread might, and continue the discussion with new perspectives.
> I'd make @ a page like 'threads' which just includes any comments with @$username.
To each their own, I’m thankful HN doesn’t have that feature.
You get blocked if you comment too fast.
it's too bad. it is a great project for a million little use cases.
The hardest things in programming. That and designing a logo for something you cannot touch, smell or see.
For me the biggest signifier is Spotify. They claim their (best) devs don't even code anymore, they use an internal AI tool that they just send prompts to which then checks out a personal test build that they can download off of Slack. "A new feature in 10 minutes!"
Okay, if that is the case, why have we only seen like 3-4 minor QoL improvements in Spotify over the last ~12 months, with no grand new features? And why haven't they fired 95% of their devs and let the remaining elite go buckwild with Claude?
The Emperor really has no clothes.
My CEO keeps asking me "how can we go faster with AI", and my answer is "we can't, because even if we had developers that would instantly develop any feature perfectly, we'd still be bottlenecked on how slow we are at deciding what to actually release".
You are seeing improvements? From what I can tell, my user experience has only been going downhill over the past years - even pre-AI...
No, they claimed they didn’t code during a time period. Around year end until early this year. Technically they could have just been on leave.
Also best dev = principal / staff engineers. They rarely code anyway.
AI or no AI anyone could have made that claim.
"Anthropic released vibe coded C compiler that doesn't work" sounds like https://github.com/anthropics/claudes-c-compiler/issues/1 passed through a game of telephone. The compiler has some wrong defaults that prevent it from straightforwardly building a "Hello, world!" like GCC and Clang. The compiler works:
> The 100,000-line compiler can build a bootable Linux 6.9 on x86, ARM, and RISC-V. It can also compile QEMU, FFmpeg, SQLite, postgres, redis, and has a 99% pass rate on most compiler test suites including the GCC torture test suite. It also passes the developer's ultimate litmus test: it can compile and run Doom.
https://www.anthropic.com/engineering/building-c-compiler
The primary objective is to retarget PyPy on top of the Python main branch. A minor objective is to document what of PyPy can be ported to CPython (or RustPython).
Keep a markdown log of issues in order to cluster and close them when fixed.
Clone PyPy and CPython.
Review the PyPy codebase and docs.
Prepare a devcontainer.json for PyPy to more safely contain coding LLMs and simplify development
Review the backlog of PyPy issues.
Review the CPython whatsnew docs for each version of python (since and including 3.11).
What has changed in CPython since 3.11 which affects PyPy?
Study the differences between PyPy code and CPython code to understand how to optimize like PyPy.
Prepare an AGENTS.md for PyPy.
Prepare an agent skill for upgrading PyPy with these and other methods.
Write tests to verify that everything in PyPy works after updating it to be compatible with the Python main branch (or the latest stable release, CPython 3.14).
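The devcontainer step in the list above could start from something like this minimal sketch; the base image, installed packages, and settings are assumptions for illustration, not part of the plan:

```json
{
  "name": "pypy-upgrade",
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  "postCreateCommand": "pip install cffi pytest",
  "containerEnv": { "PYTHONDONTWRITEBYTECODE": "1" }
}
```

Running coding agents inside a container like this limits what a misbehaving tool call can touch on the host.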
This is the perfect question to highlight the major players. In my opinion, a rapidly developing language with a clear reference implementation, readily accessible specifications, and a vast number of easily runnable tests would make an ideal benchmark.
https://claude.com/contact-sales/claude-for-oss https://openai.com/form/codex-for-oss/
And for what it's worth, PyPy isn't even eligible for the Claude trial because it has a meager 1700 stars on GitHub.
An unmaintainable mass of AI slop code and the decision to either pay the AI tax or abandon the project.
(I haven't checked the OpenAI one, as I have no interest in them)
I expect they haven't decided that themselves yet and don't want to commit publicly until they've seen how well the program goes.
You know what they could also do? Stop the programs for new enrolments next month. Or if they renew them like you said, it could be with new conditions which exclude people currently on them.
There are too many unknowns, and giving these companies the benefit of the doubt that they'll give more instead of taking more goes counter to everything they've shown so far.
We should be discussing what is factual now, not be making up scenarios which could maybe happen but have zero indication that they will.
Neither company has expressed that the six-month thing is a hard limit.
The fact that OpenAI shipped their version within two weeks of Anthropic's announcement suggests to me that they're competing with each other for credibility with the open source community.
(Obviously if you make decisions based on the assumption that the program will be expanded later you're not acting rationally.)
Edit: I understand the underlying issue and the PyPy developer's opinion. I don't disagree on that part; I only refer to the name similarity as a problem.
If your thing can be reached under "pypi.org", you can either accept that people will come up with their own ideas of how to capitalize or pronounce the name, or you can fight against windmills and tell people what ought to exist or not.
Still, at its core, PyPy is a Python interpreter which is itself written in Python and the name PyPy fittingly describes its technical design.
"Move the pypy trunk into its own top level directory so the path names stay constant."
PyPy migrated from Subversion to git at some point. Not sure how much of the history survived the migration.