Sony just spilled confidential Playstation information because of a Sharpie (theverge.com)
118 points by mvdtnz on June 28, 2023 | 102 comments


Looking at the revenues from one game is pretty shocking. Once upon a time I went down a rabbit hole trying to figure out what percentage of a game's revenue was made on PC vs console. I've long since forgotten the source, but for GTA V, something like 80%+ of Rockstar's revenue was from console.

Every time a game comes out for PC and the community complains about how 'unoptimized' it is, that figure pops in my head. Companies almost always develop to the most popular console, and put in the barest minimum of effort to port it to PC and elsewhere.

On console the RAM is shared, and moving data to / from RAM can be accomplished via onboard streaming hardware. PC lacks this, so they try to load everything in one shot to avoid stutters. That's why you see so many games these days that need 12-16GB of VRAM. Devs don't bother optimizing for PC as it's simply not worth the effort. This kind of shifts the price / perf curve way out on PC, and a lot of people are unhappy about it, but publishers don't really care anymore.


> Companies almost always develop to the most popular console, and put in the barest minimum of effort to port it to PC and elsewhere.

I am still not over those statements Microsoft made back in the first Xbox era about Fable and how it would speed up development, and how we would get games on both platforms at the same time, and then how they didn't release Fable 2 on PC even though they said DirectX blah blah. /rant


>This kind of shifts the price / perf curve way out on PC, and a lot of people are unhappy about it, but publishers don't really care anymore.

Disagree. Publishers care more about PC today than they did 15-20 years ago.

Back then, there was a real risk of the entire PC gaming market disappearing, with the exception of FPS, real-time strategy, and 4X. Once FPS (especially online multiplayer) became viable on consoles, and RTS and 4X's shares of the overall market shrank big time, the PC market was saved by MMORPGs, then Steam, then indie games, and so on.

Today you have Microsoft finally taking seriously its vow from way back when (as johnchristopher noted) to meld PC and console development and publishing, Sony unexpectedly making a PC game push, and the rising tide from Steam Deck becoming a viable platform benefiting both the mainstream PC and Linux gaming markets.

Consoles remain the largest category as a market, but the PC market compares well with any single console. There's a reason why Madden NFL returned to PC after a decade away.


Considering how much money people drop on gaming PCs these days, why is that sort of hardware advantage not available on PC as well? You'd think there would be a market for it, if it offers such an advantage.


As I understand it, you can only get that advantage by having an SoC, where the CPU, RAM and GPU are all one unit (like Apple's M1 chip). A system where, to upgrade the RAM, you have to upgrade the whole system already exists - it's called a console.


The RAM being "on unit" has no impact here. It just requires the CPU & GPU to share an IOMMU & share the memory bus. Whether that RAM is soldered or not is totally irrelevant.

Laptops have had unified memory for a ~decade as a result, but once you go to a discrete GPU instead of integrated there's no feasible way to share the IOMMU & memory bus, at which point it's not practical. And then upgradable graphics or just 400W GPUs like the 3090/4090 behemoths are incompatible without extreme cost.


> The RAM being "on unit" has no impact here. It just requires the CPU & GPU to share an IOMMU & share the memory bus. Whether that RAM is soldered or not is totally irrelevant.

Sure, but to get the same memory bandwidth as a M1 Max you'd need 8-channel DDR5 memory, so your socket would look like Epyc and consume 50W+ just for data movement.

GDDR allows moving much more data than regular DDR (or even LPDDR), but both GDDR and LPDDR5X have much tighter signaling requirements that preclude socketed memory. And that means much higher watts-per-byte-transferred for socketed memory in practice. Like dozens and dozens of watts higher for a typical configuration.
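A back-of-the-envelope check of the bandwidth comparison (the DDR5-6400 speed grade and the ~400 GB/s M1 Max figure are assumptions for illustration, not numbers from this thread):

```python
# Per-channel DDR5 bandwidth: transfers/sec * 8 bytes per 64-bit transfer.
mt_per_sec = 6400e6                 # DDR5-6400, assumed
bytes_per_transfer = 8              # one 64-bit channel
per_channel = mt_per_sec * bytes_per_transfer / 1e9   # GB/s
eight_channel = per_channel * 8
print(per_channel, eight_channel)   # 51.2 GB/s per channel, ~410 GB/s total
```

That lands roughly at the M1 Max's quoted ~400 GB/s, but spread over eight channels' worth of socketed pins and PHYs, which is where the 50W+ of data-movement power mentioned above comes from.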

> Laptops have had unified memory for a ~decade as a result

There's unified memory, and there's unified memory.

Fusion HSA and Intel iGPUs have always required you to designate memory as either graphics or CPU side so that it knows which cache controller to run it through, and this has implications for memory and cache visibility.

ctrl-f 'garlic' and 'onion': https://www.eurogamer.net/digitalfoundry-how-the-crew-was-po...

https://forum.beyond3d.com/threads/amd-kaveri-apu-features-t...

Money shot: "One issue we had was that we had some of our shaders allocated in Garlic but the constant writing code actually had to read something from the shaders to understand what it was meant to be writing - and because that was in Garlic memory, that was a very slow read because it's not going through the CPU caches. That was one issue we had to sort out early on, making sure that everything is split into the correct memory regions otherwise that can really slow you down."

In contrast, the newer stuff like PS5/Xbox Series family and Apple Silicon family has true zero-copy where it's all just flat memory and all writes are immediately visible to all clients of the memory controller. You can interleave GPU and CPU and Neural Engine tasks without forcing explicit sync into memory to flush the cache/bus. I think most people would say that's meaningfully different.

Both are "unified" but one is more unified than the other.


Do you think Apple is working on making it easier to develop for because of their SoC strategy? Like, hey, we are just like a game console, easier than PC!


Mostly because PCs are modular on axes that aren’t that useful today but also can’t easily be changed.

You can’t have the same memory with the same latency used for both CPUs and GPUs, because GPUs don’t use DIMMs and CPUs do, for example.

You also can’t guarantee a fast enough SSD to be able to delay loading assets until they are needed.


People may drop more money on PC than the average console person drops on their console, but the number of people doing this to their PC is vanishingly small. So per capita it’s better, but in aggregate, spending a bunch of dev time on such a small market isn’t smart.

Put succinctly: even if this hardware advantage were available on PC, game devs wouldn’t optimize for it.


AMD HSA (which AFAIK current-gen consoles are based on) is available on PC APUs too, although it is a pretty much completely undocumented mess on the software side.


It only works because on modern consoles the RAM is soldered directly on the board and so the data doesn’t have to travel across PCIe lanes. Doing this on a normal PC would have horrific performance. ARM Macs do have this architecture but you pay for it with not being able to replace any individual components for upgrades or even repairs.


You can have sockets/slots for your RAM without it needing to be on the PCIe bus.

Any PC would be an example.


Gaming consoles have worse performance than PCs.


> On console the RAM is shared, and moving data to / from RAM can be accomplished via onboard streaming hardware. PC lacks this, so they try to load everything in one shot to avoid stutters. That's why you see so many games these days that need 12-16GB of VRAM.

Uh, no? The majority of VRAM data is texture data, and there's absolutely no point or benefit to sharing that with the CPU. So unified vs. not is irrelevant and plays no role.

Also consoles have streamed (and absolutely still stream) texture data in regardless. Hell, texture pop-in was easily the biggest graphics issue of the outgoing generation of consoles. But even on the current generation, 16GB of shared memory is absolutely not enough to contain all the texture & geometry data for a modern open-world game. It still absolutely streams data, which is why Sony still invested so much in texture streaming tech for the PS5.


The point of shared memory is that you can skip a whole memcpy. You load from the drive into shared RAM and use it immediately. Preferably it's stored on disk in the same format the GPU can use, so you don't have to parse it.

Compare that to a PC with a discrete GPU: you have to load the texture/mesh into RAM (and possibly memcpy it to a mapped buffer the GPU can see), and then issue a copy to move it over PCIe into dedicated GPU VRAM.
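A toy sketch of that difference in plain Python (not any real graphics API; the `gpu_view_*` helpers are made up to stand in for the two memory models):

```python
data = bytearray(b"texture-bytes-loaded-from-disk")

def gpu_view_unified(buf):
    # Shared memory: the "GPU" reads the same pages the file landed in.
    return memoryview(buf)        # zero-copy view

def gpu_view_discrete(buf):
    # Discrete VRAM: the data is duplicated (stand-in for the PCIe upload).
    return bytes(buf)             # full copy

view = gpu_view_unified(data)
copy = gpu_view_discrete(data)
data[0:7] = b"TEXTURE"            # a later CPU-side write
print(bytes(view[:7]))            # b'TEXTURE' - the view sees it immediately
print(copy[:7])                   # b'texture' - the copy is already stale
```

The stale copy is the point: with a discrete GPU, every change has to be re-uploaded, whereas a shared-memory view is always current.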


That only marginally improves asset streaming performance. Both CPU RAM and a PCIe x16 slot have much more bandwidth than any NVMe drive (which is only x4 PCIe, after all, even ignoring the limitations of the NAND flash & controller itself), so that extra memcpy isn't significant. And it has absolutely no impact whatsoever on general VRAM usage, as was claimed anyway.
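Rough peak numbers behind that argument (PCIe 4.0-era figures assumed for illustration, not taken from the thread):

```python
pcie4_lane = 1.969        # GB/s per PCIe 4.0 lane, roughly, after 128b/130b encoding
nvme_x4 = pcie4_lane * 4  # ~7.9 GB/s ceiling for the drive
gpu_x16 = pcie4_lane * 16 # ~31.5 GB/s for the hop into VRAM
ddr5_dual = 2 * 38.4      # ~77 GB/s for dual-channel DDR5-4800
# The staging memcpy runs at RAM speed and the PCIe upload at x16 speed,
# both several times faster than the x4 NVMe read they follow, so the
# extra copy adds relatively little to total streaming time.
print(nvme_x4 < gpu_x16 < ddr5_dual)  # True
```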

Also, you don't have to load it into system RAM first. DirectStorage exists (kinda).


Well, that and you’re optimizing for a small number of console models vs. a large number of PC configurations.

Further, the piracy situation on PC is much worse (from the developer’s POV) than on consoles.

These factors might explain why developers wouldn’t be chomping at the bit to make well-optimized PC games.


I've been wondering something. Has the PC ever been the majority platform for video games?

It depends on how you define it. I'd define it as having more than half of all game software sales (including DLC, subscriptions, microtransactions, etc.). You could also include hardware spending, but most people bought computers to do other things. You could also use time spent playing games, and in that case, the PC might win because of Solitaire and Minesweeper.

I'd define “PC” as a machine running MS-DOS or Windows. Also, I'd include any machine with a built-in compatibility layer, like the Steam Deck. I wouldn't include other personal computer platforms.

I've been searching the web, but I haven't been able to find the stats on this.


> On console the RAM is shared, and moving data to / from RAM can be accomplished via onboard streaming hardware. PC lacks this

This hasn't been true for years now.


If you're talking about DirectStorage, it's in no way the same. That runs on the GPU, using perf that could otherwise go to rendering (I think DF saw a 10% drop in fps on some games?). Frankly, this isn't going to be fixed until GPUs have dedicated hardware in place, much the same as on console. Maybe that happens with the 5xxx series; we'll have to wait and see.


> something like 80%+ of rockstar's revenue was from console

GTA V had a massive head start on PS3 before it was released on PC, and then re-released on PS4, and then re-re-released on PS5.


I don't think that's Sharpie. It's very uniform. In some of the images, it's done per word for adjacent words - who would do that by hand, making all the intricate rounded corners? And in some, the censored areas overlap, and the overlapping area is transparent (like a Venn diagram), which suggests it was done digitally. I don't think it's physically possible to achieve that effect by hand, at least not without being extremely deliberate.

Which raises the question: if this was done digitally, did nobody look at the result? Did nobody see that the redacted text was still easily readable?

From the look of it, it's like they used a tool for highlighting, with the color set to something dark.

Without being too much of a conspiracy theorist, it almost seems like this was an intentional leak. Hanlon's razor and everything, but it wouldn't really be "malice" if they were trying to either sow false information or reveal details that they wanted to make seem like they were trying to keep private.


I wonder if they did a black highlight digitally and then printed that, not taking the time to verify that the black was dark enough that it was actually making the text under it unrecoverable.


I too have noticed all sorts of legal filings being blacked out ineffectively. And the majority of the time it is not apparent unless you're on a very bright monitor and zoom into the text. (I never said anything because I didn't want anyone to feel embarrassed that the data they were trying to protect didn't stay secret.) I always wondered what happens when my lawyers do the same thing.

When scanners shine green light onto a page, the kinds of details they reveal are extraordinary. I have been able to recover hidden text this way. The same thing happened when I took a camera snapshot of a page and used filters to make it grayscale; the hidden text pops right out.
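A minimal sketch of the effect described, in plain Python on a made-up row of gray values (a real recovery would run the same levels stretch on the scanned image):

```python
# A scanner records up to 256 gray levels. Marker ink (say level 25) and
# the toner underneath it (say level 40) rarely match exactly; stretching
# that narrow range across 0-255 makes the buried glyphs obvious.
pixels = [25, 40, 25, 40, 25]     # hypothetical scanned row under the marker
lo, hi = min(pixels), max(pixels)
stretched = [round((p - lo) * 255 / (hi - lo)) for p in pixels]
print(stretched)  # [0, 255, 0, 255, 0]
```

This is all a "make it gray and filter it" workflow does: it throws away color and amplifies tiny density differences.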


In 2021, 1 million PlayStation owners spent 100% of their playtime in Call of Duty.


It reminds me of when I'd buy used PS2 memory cards and find them full of nothing but different versions of the exact same sports titles. As someone who plays a ton of different games in very different genres I can't imagine never branching out.

If any of those 1 million Call of Duty players are here, I want to tell you that you're missing out.


As someone who usually plays only a very small number of games for a very long time (MMOs and some others), what I've noticed and also heard from others is that it's not even that other games aren't interesting or good. It's more like "I'm trying this new game and I'd say I'm having fun like 7/10, but I'd really rather be playing the game I just put down, which would be like a 9/10." Not sure if that applies to those people, or whether they really mean "100%" or more like "we saw them trying another game for 1h in a month and then they played 10h of CoD per week for 2 months", or even people not buying, but trying out other games at friends' places and then not buying them.


That makes sense to me. Multiplayer games, and MMOs in particular, are satisfying in ways that a lot of other genres really aren't. A single-player game might be fun, but your friends aren't there, and you don't have the years of investment in your character and their progression. You also don't usually get the guilds and clans and forums and spreadsheets that can keep you engaged even out of the game.

I recently finished a Final Fantasy game after more than 300 hours, so I can spend a lot of time on a single title myself and I absolutely have a few games I keep going back to when I want a break from the backlog. I think as long as people are trying out new things it's fine to fall back on familiar favorites. My worry was that these people were 100% CoD and nothing else which seems very strange and self-limiting.


Those sports games always took up a ton of memory to save. My first GC card had like 59 blocks, and Madden 2002 wanted all of them. Might explain a little bit of what you're seeing.


When I was a kid, I had like 3 games: Crash Bandicoot, FIFA and THPS. Those, along with a new demo disc each month, were all I needed.


Kids had it rough back then, especially when they were totally dependent on their parents for new titles. Today, kids with a real computer have countless games available, plus a whole ton of crappy f2p mobile gaming options to keep them going between birthdays and Christmas.


Even consoles have a decent selection of free games these days, stuff like Fortnite, Overwatch 2, Sims 4, Yugioh Master Duel, etc.


> If any of those 1 million call of duty players are here. I want to tell you that you're missing out.

I figure that all the other games are too hard for them.


Some do involve an awful lot of reading...


This really explains why I get totally destroyed at Call of Duty, since I spend about an hour a year playing it.


When I got my first full-time job, for about 6 months I spent basically all my time when I wasn't working or sleeping just playing CoD. I think my ADHD enabled it. I got really good, really fast, and was in something like the top 5% of players. Then I got into music and mostly stopped gaming.


When I was 18 I used to be crazy good at EA's NHL. Sadly it was before internet gaming, or I would have been super awesome. I was spending 8-10 hours a day playing, beating the computer at the hardest level 20-0 with 2-minute periods. No one I knew would play me. It's such a shame that when I really needed internet gaming it wasn't a thing, and now that it is, I like story games.


Have you recently tried to play any other old FPSes? Quake, Quake II, or even Unreal Tournament 99.

If you manage to find a server and active players, you can tell those players have been at it since 1999, and it takes all of the fun out of it.


But also

> Call of Duty players spending more than 70 percent of their time on Call of Duty spent an average of 296 hours on the franchise

That’s less than an hour per day, and that includes the actual hardcore CoD no-lifers. Somehow I think that “people who spend 100% of their playtime on CoD” are most likely busy parents who barely manage to squeeze a few rounds of CoD with friends into their week.


I’m one of these people. It's a common game for nearly all of my friends. I use it as time to catch up with people and hang out. It’s also the game I’m best at, so there’s little reason for me to play anything else.


Warzone was free to play and actually very good at that time. I can see why.


How on earth does a single game make sixteen BILLION dollars annually? That's an absurd number. And that's on one system.

You'd think with all that money there would be more competitors with much larger budgets.


You're misreading the text. They're not claiming that CoD makes that amount of money per year. They're saying that people who played even a minute of CoD spent that much across the PS ecosystem as a whole.


It's a very risky industry. It's quite common to sink a ton of money into a game and earn very little back, basically wasting everything. Game series that consistently make revenue are even rarer, so they're extremely coddled while they last, though it is expected that eventually they will die too. The rare successes cover the costs of the common failures.


Destiny from Bungie makes billions, but that's largely from the in-game cash shop.


Destiny does not make billions. The estimates when Sony bought them were on the order of $200M/year.


That's a lot of DLC.


> The Last of Us Part II cost $220 million with around 200 employees:

That just seems like a crazy high figure. One of the things that is kind of annoying me with games these days is how long it takes them to build the next one. Elder Scrolls is about 5-6 years away, and that probably means Fallout is 10 years away. Sure, we'll have Starfield, but decades between games in a series is just bonkers.


$220 million is not unheard of, particularly for a game that's intended to sell consoles. GTA 5 had a similarly sized budget.

https://www.dualshockers.com/biggest-video-game-budgets/

https://en.wikipedia.org/wiki/List_of_most_expensive_video_g...

As for waiting years between installments, we have plenty of great games to play through in the meantime. "We will release it when it's ready" is a mentality that has sadly been forgotten over the past several years.


> As for waiting years between installments, we have plenty of great games to play through in the meantime. "We will release it when it's ready" is a mentality that has sadly been forgotten over the past several years.

It's very questionable whether we do have plenty of great games. And there is a difference between rushing a release and spending 15 years between releases.

Putting it as "waiting years" presents it in a false light; it's waiting decades. It's very possible the gap between Skyrim and the next installment will be 20 years.

GTA 5 was 10 years ago and has been on 3 different generations of consoles. It could very well be that GTA 6 doesn't even get released on PS4 because it'll take so long.


I don't see it as any different from films in a franchise or novels by an author. They can work at whatever pace they want to. We aren't entitled to their efforts, and we have no claim over how long they choose to spend making something. I'm just glad they do.


> We aren't entitled to their efforts and we have no claim over how long they choose to spend making something.

I think as paying consumers whom they keep selling to, there is a level of entitlement. The whole "they don't owe us anything" doesn't really work; they kind of owe us their success. Especially when they re-release the same game multiple times instead of making a new one.


More frustrating: I don't think those games necessarily need that much time. It's also Bethesda spacing out the main titles of their major franchises evenly so they don't cannibalize each other. Every time Starfield got delayed, everything after it was delayed too.


Elder Scrolls has its MMO at least (which, as someone who hates MMOs, doesn't mean much to me), but agreed, it's insane that we went from 4 years between Morrowind and Oblivion, then 5 for Skyrim, to likely 15 years for another mainline game. Especially since it feels like Morrowind to Oblivion probably had a bigger change in gameplay style than I'd expect from jumping from Skyrim to the next one.


Maybe I'm an outlier because I play tons of games that are 10+ years old, but...

Skyrim w/mods is totally viable to me as an open-world RPG for years to come. I skipped Fallout 4, but if I had a refreshed (optimized, less-buggy, Linux-friendly) version of Fallout: New Vegas I wouldn't be concerned about getting another FO entry anytime soon either.

Hugely excited for Starfield though, it scratches an itch I've had for years: blending Mass Effect, Deus Ex, Fallout, and Elder Scrolls.


>Maybe I'm an outlier because I play tons of games that are 10+ years old, but...

To be fair, everyone is playing 10+ year old games. GTA, Skyrim, Fallout, etc. The next versions haven't been released and are still all years away.

> Hugely excited for Starfield though

I pre-ordered it which is huge for me since normally I just wait years until the price drops. I just picked up Cyberpunk at the weekend which should show how long I am willing to wait before getting a AAA game.


I'm contemplating a Starfield pre-order as well. Last game I pre-ordered was perhaps Hearts of Iron 4. I STILL haven't bought Cyberpunk 2077; until a game hits 75% off on my Steam Wishlist it doesn't exist. If that means I don't play it until 5 years after its release...that's ok.


I don't understand how this can possibly be true. Game devs make nothing; how did it cost more than a million dollars per (peak) employee working on the game? Are there significant costs other than employee salaries?


Even assuming those costs were _only_ employees and don't include licensing costs, contract costs for major voice talent, etc., you're still looking at about $1,000,000 per employee over the course of the game's development. If a game takes 5-6 years to develop, that's about $200k average total employee cost per year, which, once you factor in things like healthcare and other expenses an employer has to pay, doesn't seem too wild to me.


Probably stuff like offices, hardware, software licences, marketing, server costs and more, together with the fact that they worked on it for 5 years (which would mean only about $200k per employee per year) and had all those other costs for those years as well.


I'm guessing there's some Hollywood accounting going on too.


Also contractors, as others have already mentioned.


Fully loaded cost can be close to 2x salary. Spread over the entire game's development, this would come out to a couple hundred k per employee per year. This isn't insane.


My understanding is that a lot of work on TLOU2 was performed by contractors.


Maybe they aren't counting contractors as employees?


$1 million per employee over 3 years is about $150k-200k net per employee per year (if it was salary only). 3 years is a low estimate, and salaries are far from the only thing companies spend money on.


According to Wikipedia, The Last of Us Part II took 6 years to make, with over 2,100 unique people working on the product. The 200-employee number clearly isn't including contractors.

Even if we excluded the contractors and only included the employees, that's $183,000/employee/year. That doesn't seem that insane, considering that's the upper-bound cost per employee. Once you start to add in other expenses and the contractors, it quickly becomes reasonable.


I can't quite make out the actual number but Sony said TLOU2 took longer than Horizon Forbidden West's 5 years to develop. Let's just call it 6 years to make it an easy number. $220m over 200 people over 6 years ~= $200k/year/person? That seems if anything low, no?
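Spelling out that arithmetic with the figures from the comments above (the 6-year figure is the assumption made there):

```python
budget = 220_000_000        # reported TLOU2 budget
employees = 200
years = 6                   # assumed, per the Horizon comparison above
per_head_per_year = budget / employees / years
print(round(per_head_per_year))   # 183333 - fully loaded cost, not salary
```

That ~$183k/year has to cover salary, benefits, office space, hardware, licensing, and so on, which is why several commenters find it unremarkable.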


> probably means Fallout is 10 years away.

Unless Microsoft gets Obsidian to make 'New Vegas 2' on the Fallout 4 engine.


Marketing costs are not included; the budget for the average AAA game is probably very close to half a billion.


Yeah, I've also noticed that Sharpies aren't effective for this. Scanners capture 256 gray levels, so the scan can be enhanced to bring out the inked-over characters. Some solutions:

1. use a grease pencil. The grease is opaque.

2. scan in monochrome black & white, and turn up the sensitivity of the scan.

3. use MS-Paint to draw a filled rectangle over it.

4. black tape
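Option 3 only works if the tool overwrites the pixels rather than layering a translucent shape on top. A toy sketch of the difference, with plain Python lists standing in for image rows (made-up helper names and values):

```python
def redact_destructive(row, start, end):
    # Overwrite the pixels: the original values are gone for good.
    return row[:start] + [0] * (end - start) + row[end:]

def redact_overlay(row, start, end, alpha=0.8):
    # Layer a translucent black shape: the originals survive, merely
    # darkened, and a levels stretch can bring them right back.
    return row[:start] + [round(p * (1 - alpha)) for p in row[start:end]] + row[end:]

row = [200, 35, 40, 38, 200]          # light page, dark glyphs in the middle
print(redact_destructive(row, 1, 4))  # [200, 0, 0, 0, 200]
print(redact_overlay(row, 1, 4))      # [200, 7, 8, 8, 200] - pattern survives
```

The overlay output still encodes the glyph pattern in its residue, which is exactly what the scans in the article appear to show.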


The best thing I do is scan the file digitally and use PDF blackout. And then “print to image” to a PDF printer.

Other PDF editors have a pdf redaction feature that replaces the image pixels with white space. That works too.


> Sony redacted the documents with a black Sharpie — but when you scan them in, it’s easy to see some of the redactions

Man, I'm dumb. Does this mean they printed out stuff on paper, put ink on that, and scanned it back in?

Because if it was done by hand, those black strips are damn regular.

???


That, or they used a paintbrush in a digital app that isn't 100% opaque (many have an alpha effect).


Sounds like it, and I've seen it ventured that it was an intentional, but deniable, leak - to help scuttle a studio acquisition by showing the massive market impact.


Related, I've always had trouble finding a black highlighter to purchase.


It is plausible that a calligrapher did this redaction; they would have that kind of hand steadiness.


> almost half of PlayStation 5 owners in the United States also own a Nintendo Switch

They say that this data comes from an "internal survey". I'm guessing PS5 consoles are using Bluetooth to read whatever other consoles' controllers are in range and reporting that back to Sony.


Nah I’m sure it’s literally just a survey.


It'd be pretty bold to go around making claims about ownership rates across the entire country based on a poll of employees. Not only is that a tiny sample limited to specific areas of the country, people who work for the video game industry are probably a lot more likely to have multiple consoles than your average person.


Or, internal survey just means they didn't publish the results, not that they only polled employees.


I'd guess not publicly published, but it wasn't "internal" in the usual sense, because it was found in confidential documents Sony was sharing with publishers. I'd hope they weren't giving publishers inaccurate data based on a biased employee survey, so I assumed they'd use the easiest technical means of getting accurate numbers available to them, which would be polling nearby Bluetooth devices.

Other ways they might get that kind of data would be collecting it from data brokers, or analyzing PS5 players' conversations / in-game chats and messages.


You're really, really reaching here. It's faster, cheaper, and more accurate to just hire a survey firm to ask people how many consoles they have in their house.


How is finding, drawing up contracts for, and paying a separate firm (which then has to find a representative sample of people willing to answer its survey) and then waiting for those results faster, cheaper, or more accurate than just logging data their PS5s already collect every single time they are turned on? They're getting real-time data right now for free.

On the accuracy side, the only catch is that you might detect consoles that belong to someone else in the household. In some settings, like apartment complexes, they might log a device belonging to a very close neighbor, but otherwise it's a pretty good measure. I'm willing to bet they could filter most of that out by signal strength alone.

I'm just saying that there are a lot of ways sony could be collecting this data, but one of them is clearly more likely since anything else is going to cost more, require more work, and be less accurate/inclusive.


> Sony says 1 million PlayStation gamers play nothing but Call of Duty. My colleague Sean Hollister has analyzed the document, and it appears to show:

> In 2021, over [14?] million users (by device) spent 30 percent or more of their time playing Call of Duty, over 6 million users spent more than 70% of their time on Call of Duty, and about 1 million users spent 100% of their gaming time on Call of Duty

It certainly helps that the game hit 250GB+ even back in the MW2 days, such that you basically have to empty everything else off your console to keep MW2 on it.

Nice little perverse incentive not to care about disk space / not to properly implement the ability to uninstall the single-player portion (you can uninstall multiplayer, though! In... CoD... meaning it's single-player-only CoD...)


Not the first time Sony has run into trouble with Sharpies.

https://en.wikipedia.org/wiki/Key2Audio


The Last of Us Part II costing that much is absolutely crazy, considering the drop in story quality and the shattering of the characters' growth and arcs. It makes me wonder where the money went.


Can anyone here actually read the claimed text on those images? I'm seeing literally nothing there but compression artifacts. Are there higher quality images somewhere?


This looks like the result of that weird compression "algorithm" that was transposing digits on scientific papers, not actual figures from the report.


I can. Just zoom in a bit.


Some lawyer goofed preparing a document and rather than respond with courtesy and grace, the Verge writes a story amplifying the error. I don't care about video game revenues but I think this trend of "gotcha" journalism is regrettable.


I understand that it's in fashion with a certain crowd to attribute to journalists every sin imaginable, but how is "you tried to hide something relevant, and you didn't hide it, so we're reporting what you tried to hide" somehow "gotcha journalism". What constitutes "courtesy and grace" here? "Well, gee-wiz Mr. GiantMegaCorp, we're real sorry we legitimately received some information you don't want disclosed, so as our solemn duty as 'journalists' we want to know if it's okay to publish it...you know, only if it's okay with you...totally fine if you want us to pretend we never saw it"? What's exactly 'regrettable' here, that a journalist didn't bury an inconvenient fact? How is this a 'trend', as if pointing out when people with power screw up never happened until recently? Before the current fad of "anything I don't like is fake news", this would have just been 'journalism'.

Of course, these are also the kind of folks who call people like Bob Woodward and Carl Bernstein 'traitors', so what do I know.


There wasn't any real claim the redaction was illegitimate, only that the attorney for the party who wanted to protect it used the wrong technique. If the journalist wanted to publish it, the above-board way to do so would have been to ask the judge to order an unredacted version filed, or a change in the redactions... and I don't see any reason here a judge would do that.


Call of Duty revenue for PlayStation for 2021: $15.9 billion


That's incorrect. That's the total spent by players of CoD.


I'm not an accountant, but I think that's what revenue is.


You missed the point. It isn't the money spent *on* Call of Duty by PlayStation users, it is the money spent *by* Call of Duty PlayStation players on PlayStation products/services/etc. overall (some of which have nothing to do with Call of Duty).


Let's say you have a hot dog stand. If I buy a single hot dog from you this year for $3.00, you get $3 in revenue. The fact that I spend $4,800 total this year on eating out doesn't mean you have $4,800 in revenue.


As in, people who own Call of Duty are responsible for that much spending across the PS catalog


It’s a part of revenue.

Pretty sure they have some licensing revenue (clothes, accessories, products, etc.) and, possibly, some exclusivity income.


Well, the sharpie secretary is definitely fired. But this is nothing compared to the Apple employee who left the iPhone 4 prototype at a bar.

Steve Jobs' minions made sure that man would never be physically capable of holding an iPhone ever again.



