That documentary is a must-watch for any Miyazaki fan. It records the entire creation of Ponyo, from Miyazaki's watercolors to the final released movie.
It also shows why the movie ends as abruptly as it does and why Ghibli was eventually disbanded. With Takahata gone, everything was bottlenecked by Miyazaki himself, and he was simultaneously crumbling under the pressure and refusing to let anyone else have creative input in the studio.
A few years ago, I decompiled a good part of the PC version of Might & Magic 1 for fun. According to Wikipedia, it had been released in 1986, although I don't know whether that refers to the PC version or to the original Apple II version.
It is quite a big game: the main executable is 117 KB, plus around 50 overlay files of 1.5 KB each for the different dungeons and cities, plus the graphics files. I guess it was too big even for the average PC hardware of the time, or it was a limitation inherited from the original Apple II version: when you want to cast a spell, you have to enter the number of the spell from the manual, maybe because there was not enough memory to fit the names of the 94 spells into RAM. Apart from that, the limited graphics, and the lack of sound, the internal ruleset is very complete. You have all kinds of spells and objects, capabilities, an aging mechanism, shops, etc. The usual stuff that you also see in today's RPGs.
The modern uninstall.exe that came with it (I bought the game on GOG) was 1.3 MB.
>When you want to cast a spell you have to enter the number of the spell from the manual, maybe because there was not enough memory to fit the names of the 94 spells into RAM
Probably not ;) "Enter things from the manual" was a tried and true old copy protection technique. If you used the warez version, you presumably did not have a manual, so you got stuck. This didn't run on an 8008 or anything; I'm sure the game could have known the names of the spells easily enough.
Ah, that makes more sense than my theory. It's a weak copy protection method, though, as you can just try and see what happens, and I think they dropped it in M&M3.
Yes, and it was pretty easy to photocopy, since it had to be printed all in one place anyway. That's probably why even print-based protections tried to get cleverer, like the code wheels, although I remember those didn't take that much more effort either: disassemble the original, copy all the layers, cut out the right holes, put it back on a spindle.
I remember one game I had that tried to protect against it by having a manual of about 100 pages, with the passcodes being spread across all of them. I believe it was Gunship 2000.
Some of the models contain a rail segment (Märklin etc.). Was he a model train enthusiast and the houses were part of a layout, or did he use the rails just as accessories? Strange that the articles don't say anything about the artist's motivation.
That's funny because I have two objects on my desk for which I know that they use 555s. One is a no-name joystick with an "autofire" function from the late 1980s. The other is a mass-produced motor controller from the 2000s where the 555 generates the PWM signal for a FET.
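For reference, the classic 555 astable-mode datasheet formulas are easy to check in a few lines of Python. The component values below are just an example, not from either of the actual devices:

```python
# Classic NE555 astable-mode approximations (from the datasheet):
#   frequency  ≈ 1.44 / ((R1 + 2*R2) * C)
#   duty cycle = (R1 + R2) / (R1 + 2*R2)   (output-high fraction)

def ne555_astable(r1_ohms: float, r2_ohms: float, c_farads: float):
    """Return (frequency in Hz, high-side duty cycle) for a 555 in astable mode."""
    freq = 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)
    duty = (r1_ohms + r2_ohms) / (r1_ohms + 2 * r2_ohms)
    return freq, duty

# Example values (hypothetical): R1 = 1 kΩ, R2 = 10 kΩ, C = 10 nF
freq, duty = ne555_astable(1_000, 10_000, 10e-9)
print(f"{freq:.0f} Hz, {duty:.1%} duty")  # about 6857 Hz at ~52% duty
```

Note the plain astable circuit can't go below 50% duty, which is why motor-PWM designs usually add a steering diode across R2 (or drive the control-voltage pin) to get a full duty range.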
I don't know much about app development, but I was curious and downloaded the Albert Heijn apk for ARM64. Inside the apk, the three largest entities are:
- libflutter.so 140 MBytes (flutter, obviously)
- flutter_assets 29 MBytes (this is a directory. The name is a bit misleading because it mostly consists of AH-specific icons.)
- libapp.so 20 MBytes (also related to flutter, I think)
There is a 640 KByte json file in the assets that stores an animation in base64 format. Now you know what the CPU and storage resources of your devices are used for nowadays...
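For anyone who wants to check their own apps: an APK is just a zip archive, so the per-entry sizes are easy to pull out with Python's zipfile module. This sketch builds a tiny stand-in archive so it runs standalone; for a real app you'd open the .apk file directly:

```python
import io
import zipfile

def largest_entries(zf: zipfile.ZipFile, n: int = 3):
    """Return the n biggest files in the archive as (name, uncompressed size)."""
    infos = sorted(zf.infolist(), key=lambda i: i.file_size, reverse=True)
    return [(i.filename, i.file_size) for i in infos[:n]]

# Stand-in archive with made-up sizes; with a real app you would use
# zipfile.ZipFile("some.apk") instead.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("lib/arm64-v8a/libflutter.so", b"x" * 3000)
    zf.writestr("assets/flutter_assets/icon.png", b"x" * 2000)
    zf.writestr("classes.dex", b"x" * 1000)

with zipfile.ZipFile(buf) as zf:
    for name, size in largest_entries(zf):
        print(f"{size:>6}  {name}")
```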
I'm curious to know what Firefox's problem is. For example, the 3d-raytrace-SP benchmark is nearly three times faster on Edge than on Firefox on my i7 laptop. The code of that benchmark is very simple and mostly consists of basic math operations and array accesses. Maybe the canvas operations are particularly slow on Firefox? This seems like an example the developers should take a look at.
> Maybe the canvas operations are particularly slow on Firefox
That seems likely. WebRender (Firefox's GPU-accelerated rendering backend) doesn't do vector rasterization. So Firefox rasterizes vectors using the CPU-only version of Skia and then uploads them to the GPU as textures. Apparently the upload process is often the bottleneck.
In contrast, Chrome uses (GPU-accelerated) Skia for everything. And Skia can render vector graphics directly into GPU memory (at least part of the rasterization pipeline is GPU accelerated). I would expect this to be quite a bit faster under load.
It's a known problem, but I hear that almost all of the Gecko graphics team's capacity beyond general maintenance is going towards implementing WebGPU.
---
SpiderMonkey is also now just quite a bit slower than V8 which may contribute.
I think volume is what defines the cost for cargo shipping, so your 10x is likely pessimistic. $2.50 is both significant relative to the price of one of these pairs of jeans and not much compared to what has been the expected cost of a pair for decades.
It says more about how dirt-cheap shipping is the single enabler of globalization: even doubling the cost may not be enough to significantly shorten the supply chains.
"OSS projects have been able to gain a foothold in many server applications because of the wide utility of highly commoditized, simple protocols. By extending these protocols and developing new protocols, we can deny OSS projects entry into the market."