Thread #108047664
File: CELL_BE_processor_PS3_board_(cropped).jpg (23.9 KB)
Why were so many devs filtered by PS3?
It was a complex AF beast to develop for, had a long learning curve, took lots and lots of time and money for people to not be terrible with it... so it wasn't exactly easy to harness its potential.
>>108047664
you literally just answered your own question? even gaben was filtered by the CELL, but that's just because he was a money-grubber who wanted to port his games to as many platforms as quickly as possible. in essence, he had no heart for the cause
the real filtering is that programmers still have no fucking concept of async job batching, sizing jobs properly, matching generic vs. specific tasks to the right kind of (sub)processor, or how to unfuck their at-the-time single-threaded, OOP pointer-chasing "engines" into data-oriented shared memory buffers that signal each other when they're done.
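none of that needs Cell-specific magic either. not claiming any shipped engine did it exactly like this, but the bare-bones shape of "cut the work into fixed-size jobs, fan them out, signal when the batch is done" looks something like this in plain C++ (JobBatch and friends are made-up names for illustration):
[code]
#include <atomic>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// A job is a small, self-contained chunk of work over a known slice of data,
// sized so a worker can chew through it without touching anyone else's memory.
struct Job {
    std::function<void()> run;
};

class JobBatch {
public:
    void add(std::function<void()> fn) { jobs_.push_back({std::move(fn)}); }

    // Fan the batch out over N workers, then block until every job is done;
    // the "signal each other when they're done" part is the atomic counter
    // handing out job indices plus the join at the end.
    void execute(std::size_t workers) {
        std::atomic<std::size_t> next{0};
        std::vector<std::thread> pool;
        for (std::size_t w = 0; w < workers; ++w) {
            pool.emplace_back([&] {
                for (std::size_t i = next.fetch_add(1); i < jobs_.size();
                     i = next.fetch_add(1)) {
                    jobs_[i].run();
                }
            });
        }
        for (auto& t : pool) t.join();
    }

private:
    std::vector<Job> jobs_;
};
[/code]
e.g. split a 100,000-entity update into jobs of a few hundred entities each and call execute(std::thread::hardware_concurrency()); the point is that job size and data layout get decided up front instead of chasing pointers all over the heap.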
File: bepg.png (227.6 KB)
>>108048916
Yeah, that's why they shipped a handbook for how to deal with the multithreading, and other things, in a non-retarded way.
>>108047693
>the real filtering is that programmers still have no fucking concept of async job batching, sizing jobs properly, matching generic vs. specific tasks to the right kind of (sub)processor, or how to unfuck their at-the-time single-threaded, OOP pointer-chasing "engines" into data-oriented shared memory buffers that signal each other when they're done.
Do you know all of that, smarty pants?
>>108047664
we endured years of stuttery, unplayable DX12 titles because developers were too retarded to know they have to precompile shaders now, despite immense documentation
there was never any hope for a completely different architecture on a machine with a lower market share (at the time)
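(for anyone who missed the precompile point above: the fix is boring. walk every shader/material permutation behind the loading screen and build its pipeline state object up front instead of lazily at first draw. a minimal D3D12-flavoured sketch; it assumes you already have a device and a fully filled-out D3D12_GRAPHICS_PIPELINE_STATE_DESC per material, and the Material struct and PrecompileAllPipelines name are made up for illustration:)
[code]
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical material record: the desc is assumed to be filled out already
// (shader bytecode, root signature, blend/raster/depth state, ...).
struct Material {
    D3D12_GRAPHICS_PIPELINE_STATE_DESC desc{};
    ComPtr<ID3D12PipelineState> pso;
};

// Called once behind the loading screen: every permutation gets compiled here,
// so the first frame that actually draws never waits on the driver's compiler.
HRESULT PrecompileAllPipelines(ID3D12Device* device, std::vector<Material>& materials)
{
    for (Material& m : materials) {
        HRESULT hr = device->CreateGraphicsPipelineState(&m.desc, IID_PPV_ARGS(&m.pso));
        if (FAILED(hr))
            return hr;  // a broken permutation should fail the load, not hitch mid-game
    }
    return S_OK;
}
[/code]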
>>108047664
I've seen some docs on PS3 programming and I genuinely wonder if the SPUs were used at all in multiplatform titles like CoD (or were badly underused, like I assume the multiplatform RenderWare games on PS2 were).
Even with lower-budget PS3 exclusives like Tokyo Jungle or the many Japanese "niche games", is there any way to see SPU usage in RPCS3?
But I think the SPUs probably compensated for the PPE's flaws, however profound; the PS2's and PS3's strength was their vector units, not the main CPU, unlike the design philosophy of the PS1 and the PS4/PS5 that followed.
>>108049651
What warranted this reaction?
Like, I get it, the Xpoop360 had better framerates than the piss3 in CoD and the rest of the UE3 C++ OOP trash back then, but the question remains: if the piss3 version had been optimized to death, would it have btfo the xpoop?
File: 1762354488267485.jpg (532.2 KB)
reminder that op is a severely autistic niggerfaggot that has been spamming these worthless dogshit threads about the cell cpu for years now
even the filename is similar
>>108049597
seems like most devs just ran it on one core and called it a day, not bothering with the hardware at all
the irony is that it was still a time when 1st parties kept secrets from 3rd parties, creating a massive gap in graphics/performance
File: 1762054331974983.jpg (61.9 KB)
>>108050120
>that last sentence
he warned us, we did nothing
>Make a product in an industry where your product is contingent on an army of third-party programmers who are becoming increasingly overworked and short on time
>Make hardware incompatible with the style of software said third party programmers write
>Don't even bother making an LLVM backend that will attempt to utilize the hardware automatically
It was a retarded move from the get-go. This was obviously an engineering fellow's pet project, one without a single lick of business sense or grasp on the market.
>>108049597
>the PS2/3's strengths were their vector units
The problem is that outside of graphics and to a lesser degree physics, there's not really a whole lot this hardware is useful for.
If they had dumped the R&D budget into a massive cache, a deep instruction pipeline and superior branch prediction, they could have cemented themselves as the better console without relying on the goodwill of studios to write a massive amount of platform-dependent code that would influence the architecture of their whole codebase.
>>108048634
This article is based on two misunderstandings, one technical, one cultural.
First is the retarded American mind-plague of 'we can solve problems through *technology words*'. No, OoO execution doesn't solve cache misses. Your typical code can have like 100 instructions in flight (and even that is generous), which keeps a single OoO core busy for maybe 25-30 cycles. A 4GHz core has a memory latency to RAM of about ~1000 cycles. So OoO literally solves fuck-all; it takes you about 3% of the way to the perfect magical solution of infinite continuous memory, and yes, CPUs spend most of their time waiting on RAM even though modern caches are huge (daily reminder that 5/3/2nm is a BS metric based on packing logic tighter, and SRAM cache structures barely shrink any more).
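(back-of-envelope version of that claim, taking the post's round numbers at face value rather than as measurements:)
[code]
#include <cstdio>

int main() {
    // Round numbers from the post above, not measurements.
    const double window_cycles = 30.0;    // ~100 in-flight ops keep one OoO core busy ~25-30 cycles
    const double miss_latency  = 1000.0;  // the post's figure for a DRAM round trip at 4 GHz
    std::printf("OoO window hides roughly %.0f%% of a full cache miss\n",
                100.0 * window_cycles / miss_latency);  // ~3%
    return 0;
}
[/code]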
This 'flat memory' model is completely retarded, especially considering that any sort of multi-threaded programming in any language on today's computers is flat-out impossible without relying on arcane libraries whose insides contain black magic to synchronize access.
I'm not saying Cell/The Japanese/IBM figured this out, but at least they did the 'return to monke' thing of burning this shit all down and rebuilding it from scratch.
The cultural thing is that the Japanese (and Asians in general) are both more intelligent and hardworking than Westerners. The combination of these 2 attributes means that 'programmer' occupies the same rarity category, and rung on their social ladder, as any semi-skilled worker. This means they have droves of the guys and they can afford to treat them like shit. What does it matter if your architecture is not easily programmable, when you have armies of the guys and you can work them like dogs and pay them peanuts.
>>108052613
>No, OoO execution doesn't solve cache misses
Wrong on three counts:
1. It literally reduces the impact of cache misses thanks to prefetching (rough sketch of the idea below)
2. The rest of OoOE deals with problems that exist regardless of cache misses
3. The SPE doesn't solve cache misses either
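(minimal software version of point 1, since hardware prefetchers are invisible: issue the fetch a few iterations ahead so the miss overlaps with useful work. __builtin_prefetch is the GCC/Clang builtin; the lookahead of 16 and the function name are arbitrary illustrative picks, not tuned values.)
[code]
#include <cstddef>

// Sum an array while prefetching ahead, so the cache line holding
// data[i + 16] is already on its way by the time the loop needs it.
long long sum_with_prefetch(const int* data, std::size_t n)
{
    long long total = 0;
    for (std::size_t i = 0; i < n; ++i) {
        if (i + 16 < n)
            __builtin_prefetch(&data[i + 16]);  // read prefetch, default locality
        total += data[i];
    }
    return total;
}
[/code]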
The entire reason for the satellite processor setup, rather than having many general-purpose cores with the same vectorization capability in their SIMD units, is to try and get around pipeline stalls, because the PPE was a narrow in-order core with a laughably long pipeline.
>any sort of multi-threaded programming in any language on today's computers is flat-out impossible without relying on arcane libraries whose insides contain black magic to synchronize access
Do you think the Cell didn't rely on compiler intrinsics to leverage the SPUs? Also, complaining about synchronization primitives is really fucking weird, because synchronizing between SPUs is a nightmare of spu_mfc_getllar/spu_mfc_putllc fine-grained atomic spaghetti.
>but at least they did the 'return to monke' thing of burning this shit all down and rebuilding it from scratch.
And they ended up with a garbage 'solution' that didn't actually address any real problems.
>>108053351
The main thing you have to realize is that if you have a processor like the SPE, which can do floating point/int math and basic branching with 1-2 cycle latencies and can ONLY write to its local SRAM, the entire CPU can be reduced to something like 4-5 pipeline stages and will clock extremely high.
99% of the complexity of CPU designs and pipelines comes from having to interact with a complex memory hierarchy (caches, load/store buffers, in-flight instruction buffers, etc.) while maintaining the illusion of a CPU that executes code sequentially over a flat memory space.
Now, having only a couple hundred kilobytes of local buffer as your entire memory space, and having to manually copy shit to and from that buffer, is a pretty crude solution, and I bet much better ones exist.
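(the usual way that crude model was made to fly, for anyone curious: double-buffer the local store so the transfer for chunk N+1 overlaps with the math on chunk N. dma_start/dma_wait below are hypothetical stand-ins for the real asynchronous MFC DMA calls, they just memcpy here so the sketch compiles, and it assumes the total size is a non-zero multiple of the chunk size.)
[code]
#include <cstddef>
#include <cstring>

constexpr std::size_t CHUNK_BYTES  = 16 * 1024;                   // pretend local-store budget per buffer
constexpr std::size_t CHUNK_FLOATS = CHUNK_BYTES / sizeof(float);

// Hypothetical stand-ins for the async DMA engine. On real hardware the copy
// would run in the background while process() crunches the other buffer.
void dma_start(float* local, const float* main_mem, std::size_t bytes) { std::memcpy(local, main_mem, bytes); }
void dma_wait() {}

// Placeholder per-chunk work.
void process(float* chunk, std::size_t count) { for (std::size_t i = 0; i < count; ++i) chunk[i] *= 2.0f; }

// Classic double buffering: while buf[cur] is being processed, buf[1 - cur]
// is already being filled with the next chunk from main memory.
void run(const float* main_mem, std::size_t total_floats)
{
    if (total_floats == 0) return;
    static float buf[2][CHUNK_FLOATS];
    std::size_t cur = 0;

    dma_start(buf[cur], main_mem, CHUNK_BYTES);
    for (std::size_t off = 0; off < total_floats; off += CHUNK_FLOATS) {
        dma_wait();                                       // make sure buf[cur] has arrived
        const std::size_t next = off + CHUNK_FLOATS;
        if (next < total_floats)
            dma_start(buf[1 - cur], main_mem + next, CHUNK_BYTES);  // kick off the next transfer early
        process(buf[cur], CHUNK_FLOATS);                  // crunch the chunk we already have
        cur = 1 - cur;
    }
}
[/code]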
But much like OSDev, it's really fucking hard to come up with a decent idea of what the natural evolution of the current thing will look like, while the market insistently demands a slightly better version of the thing you already have.
It seems console hw devs have already given up, and each console is just a PC.
Even what little is there goes unused. Current-gen consoles promised endless open worlds without loading screens thanks to shit like GPU virtual texture memory and SSD decompression engines, where you could refill memory with new scene data in a frame or two, yet there hasn't been a single technically impressive game this gen. (Which is all beside the point.)
It was so nice to see Sony actually taking fundamental risks, which is sadly not the case any more.
>>108053644
>yet there hasn't been a single technically impressive game in the current gen
FF7 Rebirth was pretty good in how much area you could cover without loading screens, and the third part could push that even further with flying transport, but we'll see.
One thing to keep in mind with the PS3 is that the OS reserved one of the SPUs for its own background tasks, which means that when playing a game the PPU and the other SPUs weren't switching contexts as often as, say, your desktop CPU running Windows with 50 background processes/programs.
Cache states were much more predictable, and the programmer could help the PPU reduce cache misses with hints about upcoming data requests.
>>108053809
>toxic mindset
Not really, more like "practical mindset."
Ken Kutaragi made the PSX and PS2 massive successes, so Sony gave him carte blanche (after first telling him his pet project wouldn't take off, which it did, hence...) to do more magic. Unfortunately for Sony, Crazy Ken went full retard just as MS (and Nintendo, IIRC) started pushing "out-of-the-box programming" instead of "program for our crazy shit!" like in the 1990s.
People wanted something they could get up and running instantly on, not having to figure out new shit that would maybe not work in a future console.
It's why shit like Unreal Engine became a standard over in-house engines. People don't want to reinvent the wheel each. and. every. time... if they don't have to.
>>108053475
Prefetching is generally useless for games without branch prediction and superscalar execution.
>>108053644
The problem is that in doing so you're sacrificing transparent performance gains for opaque ones. Sony couldn't even be bothered to implement that transparency in their toolchain and automatically generate SPE utilization in a limited fashion. Hardware transparency wins in the labor-hungry gamedev sweatshops, because work spent on learning and exploiting specific features of your hardware is work not spent elsewhere. The less work developers have to do to take advantage of your specific hardware, and the fewer constraints it imposes, the more your hardware will actually get used.
It's basically a set of design decisions that flagrantly disregard the realities of production. In 2005, AAA games were already pushing close to a million lines of code. It's not just irresponsible but idiotic to think that anybody except first-party developers with huge talent pools is going to take advantage of something like the SPE, because there is literally zero incentive for anybody else to do so. It's not 1985 anymore; a game takes a bit more work than 10k lines of 6502 these days, you know?
And in the end we have to look at what the Cell really gets us, because it's not like fully utilizing it gets us a generational leap. It amounts to things like handling basic cloth simulation a la MGS4.
>yet there hasn't been a single technically impressive game in the current gen
Death Stranding 2, I guess. I haven't played it because I don't own a console. The problem is manifold: Nvidia and engine shops like Unity and Epic have a death grip on the talent that would otherwise be working on in-house engines, the AAA industry is currently on fire, and indies can't afford the literal millions of man-hours it takes to make a technical marvel. You're right in the implication that we've backed ourselves into a corner. Maybe it's time for a crash?