One person on Discord said:
“BadMrBox — Today at 4:08 PM
Another 9.6gB patch? That’s only acceptable if the whole game have been redone lol.”
My response was:
⛧⛧⛧ — Today at 4:16 PM
just MAYBE they can split it up in the future?
make the file 3 to 4 or even more parts?
would solve a lot”
I believe that splitting up large files into smaller ones (maybe 10 parts of about 1 GB each) would solve this.
It would also save on upload for Avalanche (thus save money through saving bandwidth) AND download for those limited in transfer sizes by their ISP as well as time for the slow connections.
Would you also claim that if YOU were on an 8-baud modem?
Note: at that speed, a 5 MB picture would take half a day to download…
Only us old folks remember those, but they were a fact…
Still think this one over… then be so kind to tell me your opinion.
Do you really think that when devs release an update, they only change some files? If so, how is the new content supposed to get into the game? For that, new files are needed.
That’s one thing. Another is that Micro$oft and Sony (the console makers) don’t accept multiple updates per month, making a split update impossible. That was most evident with February’s hotfix, which PC players got as soon as it was released while console players had to wait until the monthly console update rolled in.
Internet connection speed is limited everywhere. No one has unlimited bandwidth, even sitting on a fiber-optic line. Still, for online gaming, about 100 Mbit/s down and 10 Mbit/s up is decent enough to download and sync with other players, unless your ping is through the roof.
Miss… I am not speaking of more than one update, but of actually cutting large files into parts.
So, instead of a 9.6 GB download, it could be 3 GB over three affected files.
No more need for the 9.6 GB file, but 10 different parts made from the big file?
And how does that help you? The download time is the same or even longer for all the parts combined.
E.g. 1x 9.6 GB file = 40 mins vs 10x 960 MB files = 5 mins per file = 50 mins for all of them.
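The arithmetic above can be sketched in a few lines. Note the 1-minute per-file overhead is a made-up illustrative figure chosen to match the example, not a measured Steam value:

```python
# Back-of-the-envelope model: transfer time scales with size,
# plus a fixed per-file setup cost (hypothetical, for illustration).

TRANSFER_MIN_PER_GB = 40 / 9.6   # from the "9.6 GB in 40 mins" figure above
OVERHEAD_MIN_PER_FILE = 1.0      # assumed per-file connection/verify cost

def total_minutes(total_gb: float, parts: int) -> float:
    """Minutes to download total_gb split into `parts` equal files."""
    return total_gb * TRANSFER_MIN_PER_GB + parts * OVERHEAD_MIN_PER_FILE

print(round(total_minutes(9.6, 1)))   # one 9.6 GB file -> 41
print(round(total_minutes(9.6, 10)))  # ten 960 MB files -> 50
```

Splitting the payload doesn’t reduce the bytes transferred; it only adds per-file overhead, which is the point being made here.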
Also, in Steam you can configure when files are downloaded. With auto-updates enabled they are downloaded as soon as they are made available, but you can change this: e.g. don’t download while you’re playing something, or don’t download at all unless you specifically allow it.
If you edit 500 MB across 4 ‘parts’ of 1 GB… compared to a single massive block of 9.6 GB?
It’s easier to upload 4 separate parts totaling 4 GB than a single block of 9.6 GB.
MUCH faster, less demanding on hardware and network, no?
If you upload 16 GB as 4 parts of 4 GB, the total upload time will be longer than uploading 9.6 GB as one single file, since 16 GB is simply more data than 9.6 GB.
Also, downloading one big file is no more demanding on hardware than downloading several small ones. If anything, several files have a bigger impact on your storage drive, since it has to start and end a write cycle four times (with split files) rather than once (with one big file).
The idea is not to split up the TIMES to patch, but to split the huge 9.6 GB chunk into smaller bits.
Someone recently ‘complained’ about needing 5 hours for one patch…
Sir BadMrBox then said what I copied in the first post.
They are not wrong, miss.
And my idea just might help both the less fortunate with poor net speeds and limited download allowances, as well as Avalanche…
Note… MIGHT (in Avalanche’s case).
Chopping up a data block… I do not know how hard that is, or how much it requires, resource-wise.
I do not mean harm, miss… I only try to help out by giving a simple idea…
Steam uses binary deltas that are automatically created when you upload your build. It compares the uploaded build to what the user has installed on their computer, then sends the binary delta for that version combination. Games tend to use a few very large files instead of a lot of little ones (like they used to in the past) because it’s faster for the computer to copy/read one large file than lots of little ones, even if they take up the same space on your drive.
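The “few large files beat many small ones” point can be demonstrated with a rough timing sketch. Everything here (file names, sizes, the 1,000-part split) is illustrative, and real numbers depend heavily on the OS, caches, and disk:

```python
import os
import tempfile
import time

# Toy comparison: reading the same 10 MB as one file vs. 1,000 small
# files. The per-file open/close overhead is what hurts the small files.
payload = os.urandom(10 * 1024 * 1024)

with tempfile.TemporaryDirectory() as d:
    big = os.path.join(d, "pack.bin")
    with open(big, "wb") as f:
        f.write(payload)

    chunk = len(payload) // 1000
    smalls = []
    for i in range(1000):
        p = os.path.join(d, f"part{i:04d}.bin")
        with open(p, "wb") as f:
            f.write(payload[i * chunk:(i + 1) * chunk])
        smalls.append(p)

    t0 = time.perf_counter()
    with open(big, "rb") as f:
        f.read()
    t_big = time.perf_counter() - t0

    t0 = time.perf_counter()
    for p in smalls:
        with open(p, "rb") as f:
            f.read()
    t_small = time.perf_counter() - t0

print(f"one big file: {t_big:.4f}s, 1000 small files: {t_small:.4f}s")
```

On most systems the 1,000 opens and closes cost noticeably more than the single sequential read, even though the total bytes are identical.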
In order to make large files, the engine compiles the game data into those large files. Some compilation is basic - just sandwiching the files together - and some is more advanced, where unneeded information is removed to streamline the loading process and file size. Because of this, if you update anything when files are sandwiched - even the access date on a file - the binary delta that gets sent will include that change. With more advanced compiling, one change can radically modify the entire output of the compilation, which results in a very large binary delta.
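A toy illustration of that difference, using a naive byte-by-byte diff. Steam’s real delta format is far more sophisticated, and the `toy_build` step is a stand-in I made up for any transforming compile pass (compression, packed offset tables, etc.):

```python
def naive_delta(old: bytes, new: bytes) -> list[tuple[int, int]]:
    """(offset, new_byte) pairs where the two builds differ."""
    return [(i, b) for i, (a, b) in enumerate(zip(old, new)) if a != b]

def toy_build(data: bytes) -> bytes:
    """Toy 'advanced' compile step: every output byte depends on all
    the input before it, the way compression or offset tables do."""
    out, acc = bytearray(), 0
    for b in data:
        acc = (acc + b) & 0xFF
        out.append(acc)
    return bytes(out)

# Hypothetical assets "sandwiched" into one big pack file.
model = bytes(1000)
textures = bytes(range(256)) * 40
old_pack = model + textures

# Patch a single byte in the "model" section.
new_pack = b"\x01" + old_pack[1:]

# Plain sandwiching: the delta is exactly the edited byte.
print(len(naive_delta(old_pack, new_pack)))                        # -> 1

# After the transforming build step, the same one-byte edit
# changes every following byte of the output.
print(len(naive_delta(toy_build(old_pack), toy_build(new_pack))))  # -> 11240
```

With plain sandwiching the delta covers only the edited byte; after a transforming step the same edit touches the whole file, which is why some patches balloon.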
Additionally, some compilations will split similar things across different large files (so one large file may contain a model of a machine, another one group of textures for the machine, another a different group of textures, and yet another the machine’s AI), making the binary delta even larger. It depends on the engine and the development tools.
It’s not as easy as “just split them apart”. There’s a reason that developers do this. They are absolutely aware of how their engine I/O works, compilation options, deployment building, and associated deployment functions operate. It might suck to have to download a lot -once- to patch the game, but you make up for it because load times are faster -each time you play- and bandwidth is cheap for the majority of users.
It’s quite easy on Win 7 or Win 10 to add your mobile phone to your current connection and combine download speeds.
4G connections are very common and can easily give the download boost you need (an unlimited data plan helps, or may even be required).