CPCEC GTK missing in action

Started by CPCBEGIN, 20:44, 13 September 22



CPCBEGIN

CPCEC GTK is a fork of CPCEC for Amstrad CPC Linux developers. It could be downloaded from https://gitlab.com/norecess464/cpcec-gtk, but the link has disappeared.
Does anybody have this emulator, and could you share it with everybody?

Retro & opensource

norecess

Yeah I just moved all my stuff from GitLab to Bitbucket.

For the trivia: after years of being friendly to "free" projects, GitLab started to get greedy lately (from "free" to $19/month in my case!). I took a quick look at GitHub and its limits were against me too (1 GB max repo storage, 100 MB max file size). I ended up at Bitbucket: no limits, private repos, all good.

The new URL: https://bitbucket.org/norecess464/cpcec-gtk/src/master/

pelrun

If you're trying to store large binary files in git repos (and not leveraging content-addressed storage like git-lfs), you're doing something really wrong :laugh: Even then, it's hard to hit 1 GB!
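For reference, a minimal sketch of how git-lfs tracking is typically set up (assuming git-lfs is installed; the *.dsk pattern is just an illustrative example):

```sh
# One-time setup per machine
git lfs install

# Route files matching this pattern through LFS instead of regular git storage
git lfs track "*.dsk"

# The tracking rules live in .gitattributes, which must be committed
git add .gitattributes
git commit -m "Track disk images via git-lfs"
```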


CPCBEGIN

Thanks for sharing this great emulator.
Retro & opensource

cpcitor

Quote from: norecess on 21:31, 13 September 22
For the trivia: after years of being friendly to "free" projects, GitLab started to get greedy lately (from "free" to $19/month in my case!). I took a quick look at GitHub and its limits were against me too (1 GB max repo storage, 100 MB max file size). I ended up at Bitbucket: no limits, private repos, all good.

The new URL: https://bitbucket.org/norecess464/cpcec-gtk/src/master/

Good to see that.

Quote from: norecess on 21:31, 13 September 22
1 GB max repo storage

The cpcec-gtk .git directory occupies 671 MB (according to du): more than half the limit.
For comparison, in cpc-dev-tool-chain the .git directory occupies 39 MB, about 25 times smaller than the limit.
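(To reproduce the measurement on any clone, a quick check with du:)

```sh
# Total size of the history database
du -sh .git

# The packed objects usually dominate
du -sh .git/objects
```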

The solution: do not create the problem in the first place; there is no problem until it is actively created. That is:

1. Do not commit binary files (no executables for any OS, no libraries for any architecture, no SQLite databases, no Debian packages, no binary HLP files, no .o files). A repo is for source.
2. Do not commit generated files (several of the above, plus backup files `*~`). A repo is for source.
3. Do not commit files that contain paths specific to your machine. The repo is shared with others.

Also, do not copy the contents of other projects into the repository. Fetch them on demand.
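A sketch of one way to do that, using git submodules (the URL and path here are hypothetical):

```sh
# Reference another project at a pinned commit instead of copying its files in
git submodule add https://example.com/some/dependency.git vendor/dependency

# Anyone cloning the repo fetches the dependency on demand
git submodule update --init
```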

Finally, before any commit, review your changes and stage only the relevant files.
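A typical review before committing might look like this (a sketch; interactive staging is one way among several):

```sh
# See what changed
git status
git diff

# Stage only the relevant hunks, interactively
git add -p

# Verify exactly what is about to be committed
git diff --staged
```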

Optional tip: populate a .gitignore file to help filter out irrelevant files.
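A small starting point might look like this (the last entry is a hypothetical machine-specific file, named only for illustration):

```
# Build outputs and binaries
*.o
*.bin

# Editor backup files
*~

# Hypothetical machine-specific configuration
local-paths.conf
```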



There are exceptions that apply to cpc-dev-tool-chain:

1. PNG images that are part of readme/documentation.
2. Models for test cases.
3. No exception. Publishing a file that contains a path specific to your machine... what will predictably happen on all other machines?

Quote from: norecess on 21:31, 13 September 22
100 MB max file size

In my local cpc-dev-tool-chain, the biggest blob in the .git directory is a 461 kB test-case log, there to ensure that the fast multiplication routine works for *all* cases. Those test models are the only files above 100 kB, and that threshold is 1000 times smaller than the limit!
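(Anyone can check their own repository; this is a sketch of a common idiom for listing the ten largest blobs anywhere in a repo's history:)

```sh
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" { print $3, $4 }' |
  sort -rn | head -n 10
```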

A lean repo is faster for all operations and easier for everyone, and with such a repo every hosting platform is friendly. That probably contributes to cpc-dev-tool-chain remaining on GitHub: I have never had any problem.

When we code for the CPC, we take care of every byte and every CPU cycle. Why should we waste hundreds of megabytes on hosting platforms, user bandwidth, and so on?

(If you wish to put the repository on a diet and purge heavy useless content, I can provide some assistance.)
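For the curious, such a purge is often done with the third-party git-filter-repo tool; a sketch, with the size threshold chosen arbitrarily (this rewrites history, so every contributor must re-clone afterwards):

```sh
# DANGER: rewrites all history; coordinate with all contributors first.
# Drops every blob larger than 10 MB from the entire history.
git filter-repo --strip-blobs-bigger-than 10M
```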

My 2 cents.
Had a CPC since 1985; currently a professional software developer, including embedded systems.

In 2013 I made the first CPC cross-dev environment that auto-installs a C compiler and tools: cpc-dev-tool-chain, a portable toolchain for C/ASM development targeting the CPC, later forked into CPCTelera.

norecess

> Repo is for source.
I understand what you mean, but there is nothing preventing me from putting data in source control.

I switched to Bitbucket because I have other projects that are big archives, too. I like the fact that I can submit stuff incrementally over the years, and it's perfectly fine (to me) to treat git as cloud storage at some point.
 
> Why should we waste hundreds of megabytes on the hosting platforms, user bandwidth, etc?
Because PCs are made to suffer? Or maybe because it's 2022. Anyway.

I will continue to apply my own guidelines on my own project(s), if you don't mind ;)
