kde-i18n-doc is the place to get feedback; we don't really care about the feedback of random people who have never translated any KDE software.
Wed, May 14
As this is a long term task, maybe it is helpful to ask someone to give feedback on the workflow. Dina Nouskali could probably be very good at coordinating this, as Dina comes into contact with a diverse set of KDE users and is also multilingual.
Sun, May 11
In T11070#318629, @ariasuni wrote: Hi, I’m wondering how to move this issue forward.
Jan 21 2025
There's really no good status to say that
https://invent.kde.org/teams/localization/issues/-/issues/1
is the follow-up story.
Jun 23 2023
Jun 18 2023
I moved this patch to GitLab: https://invent.kde.org/sdk/lokalize/-/merge_requests/60
Oct 18 2022
autocorrect moved to calligra, where at least it's being used: https://invent.kde.org/office/calligra/commit/c6c6c899a3b640555df551d96d0668bb16ae0042
Oct 17 2022
Oct 12 2022
khangman did not have any .kvtml files:
https://websvn.kde.org/trunk/l10n-kf5/nn/data/khangman/khangman/?pathrev=1605218
Oct 11 2022
@aacid, I see that khangman is marked as ‘done’, but I can’t find the .kvtml files (e.g., for language ‘nn’) in the khangman Git repository. Only the .txt file containing the list of extra characters in the alphabet has been added. Are the .kvtml files stored somewhere else?
Oct 10 2022
Oct 8 2022
Thanks for opening that request against Lokalize to plug into Weblate's existing API. I feel the best way forward is a workflow that uses Weblate to remove the need to deal with git/svn, encouraging contributions from non-Computer-Science types, while still allowing those who use Lokalize to keep using it.
I agree it could be a good solution but I'm looking at it from a different angle.
Oct 7 2022
Online tools generally slow down high-volume translators. Plus, they are slow and require an internet connection. Downloading and uploading translations is always a hassle, and error-prone.
Aug 13 2022
Jul 31 2022
Would someone (@ltoscano ?) be able to give a short status update on this at Akademy? Where discussion/work/testing is required?
Jul 25 2022
Last activity here was a year ago; have there been any further thoughts or updates on this?
Jun 13 2022
May 30 2022
autocorrect is a PAIN because it's used by both pimcommon and calligra, so where do we put it? Currently we're not putting it anywhere, so it's not even being used ^_^
May 25 2022
May 22 2022
I've removed the kturtle data; it was all for a very old version of kturtle, and the new version of kturtle doesn't support those data files.
Feb 23 2022
Feb 21 2022
Well, this tangent is kinda moot now; after some chatting, it's probably not going to work out having the Hebrew team work on the instance I set up for the Toki Pona team.
That's probably acceptable, but I'm going to (maybe wrongly) assume you don't also speak Hebrew?
I'm going to warn you against that, you can't create a system to automatically commit things from other people into KDE servers without human interaction, that's basically giving out your user for anyone else to [ab]use, and that's obviously not allowed.
@aacid That's maybe relevant to pootle, in Weblate there's an ACL and a string approval system.
Could we chat on a more real-time platform to hash out the details of that? My matrix is @pontaoski:kde.org and Telegram is https://t.me/pontaoski
@cblack Yes, I'd love that.
You want to use the Weblate instance set up for the Toki Pona team for the Hebrew localisation?
Sorry, I don't understand what you're trying to say here.
Feb 20 2022
@cblack What will it take to simply join them?
Feb 18 2022
Another data point: the Toki Pona team uses a Weblate instance working directly against the SVN. It works fine, apart from some fiddling with the internal checkout Weblate uses, required to keep it from dying during its "scanning" phase on the sheer number of files we have.
Jan 25 2022
The website has been refreshed and is ready to be extracted by Scripty.
Dec 19 2021
Mar 30 2021
In T13514#248309, @aspotashev wrote: Btw I've started slowly rewriting scripty in Go because
- I like Go more than bash.
- It's easier to make it more configurable than in the current state with a set of scripts calling each other. We need configurability to share codebase across branches and to let users run parts of scripty end-to-end (will be useful for www that needs more frequent syncs).
- It's easier to parallelize things in Go, so we could shorten the running time.
I didn't share my code yet. Don't expect fast progress, maybe I'll be able to release something by May 2021.
Just in case, this is the thing I'm working on, but it's still heavily WIP, especially the injector interface (something is already working with the example legacy extractor):
https://invent.kde.org/ltoscano/noktra/
Feb 10 2021
Done! (trunk r1592930, branch r1592931)
Feb 7 2021
That's also an option, simply remove the need for it 👍
The original solution with srDataMacros was appropriate for the time of single language packs, but in this case (where various maintainers would have to deal with it directly, even if in ecm) I think it would cause too much confusion.
The only way I can think of is to make it more generic, make it not use Perl, and move it to ECM.
In T13618#240765, @aacid wrote: kdeedu-data, ktuberling and khangman are blocked by sr's magically shared CMake file cmake_modules/srDataMacros.cmake
Any idea how to proceed with it?
I guess we need to continue with this. I was told LFS works on our repositories, so at least we could resume this when there are no other blockers.
Feb 3 2021
Jan 17 2021
@huftis @aspotashev There are too many parameters involved; you can't measure performance like that.
Jan 15 2021
In T13514#248310, @ltoscano wrote: Please note I'm working on a base replacement in Python too. The idea is to have something modular (starting from an independent translation extraction application which may be used elsewhere). I plan to share something before that deadline.

A binary is not going to help much, as most of the time is going to be spent on disk I/O, in my experience.
Jan 14 2021
In T13514#248308, @aspotashev wrote: Another problem with SVN is that it's slow. It sometimes takes at least 10 minutes to "svn cleanup" + "svn up" when only the translations from the trunk/kf5 and stable/kf5 branches are checked out, making a full run of scripty slow on my laptop (note: I don't use an SSD, that might be the reason).
Please note I'm working on a base replacement in Python too.
The idea is to have something modular (starting from an independent translation extraction application which may be used elsewhere). I plan to share something before that deadline.
Btw I've started slowly rewriting scripty in Go because
- I like Go more than bash.
- It's easier to make it more configurable than in the current state with a set of scripts calling each other. We need configurability to share codebase across branches and to let users run parts of scripty end-to-end (will be useful for www that needs more frequent syncs).
- It's easier to parallelize things in Go, so we could shorten the running time.
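The parallelization point is easy to illustrate in any language; below is a minimal Python sketch of fanning per-catalog message extraction out over a worker pool. The catalog names and the `extract_messages` placeholder are invented for illustration, not scripty's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def extract_messages(catalog):
    # Placeholder for the real per-catalog work scripty does
    # (e.g. running xgettext/msgmerge for one catalog).
    return f"{catalog}.pot"

# Hypothetical example catalogs.
catalogs = ["dolphin", "okular", "lokalize"]

# Process catalogs concurrently instead of one after another.
with ThreadPoolExecutor(max_workers=4) as pool:
    pots = list(pool.map(extract_messages, catalogs))

print(pots)  # ['dolphin.pot', 'okular.pot', 'lokalize.pot']
```

Since the real work is dominated by I/O and subprocess calls, even a thread pool (rather than processes) would shorten the wall-clock time.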
In T13514#237699, @aacid wrote: "Git should facilitate integration with external translation file trees maintained by Linux distributions (e.g. BaseALT)"
Let me rephrase this for you: "We're doing things wrong, can you please change your upstream behaviour so it's less painful for us to do things wrong?"
Jan 3 2021
In T11070#247390, @ltoscano wrote: Translations in the repositories would make the life of people who touch several repositories almost impossible, so it's out of the question.
The solution is to make weblate work with a structure which fits us (either svn or per-language git repositories).
Jan 2 2021
In T11070#247315, @sanecito wrote:
Translations in the repositories would make the life of people who touch several repositories almost impossible, so it's out of the question.
The solution is to make weblate work with a structure which fits us (either svn or per-language git repositories).
So I just tried to import all the summit translations into a local Weblate instance (using Docker). Unfortunately, the import failed after an hour with a timeout. Also, there is no native SVN support; it just uses git-svn. It looks like Weblate is not optimized to handle big projects like KDE for now, or at least not on a 16GB RAM / i7 CPU laptop. Also, the import was single-threaded :(
Migration to Git is over, abandoning this patch.
Jan 1 2021
I think that switching to Weblate would cause more harm than good due to:
- slower performance
- "hit-and-run" translators
@yaron
I think it's a good idea.
Why not integrate Lokalize with Weblate? There's a great API and you can choose what to load, possibly having some background worker prefetch and sync the strings, glossary, suggestions, optional screenshots, TM, etc. to speed things up.
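As a rough illustration of what such a background worker could call: the sketch below builds the URL for Weblate's unit-list endpoint and fetches it over its REST API. The instance URL, project/component names, and token are hypothetical; the endpoint path follows Weblate's documented REST API, but verify it against the `/api/` root of the instance you target:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical Weblate instance; replace with the real one.
BASE = "https://translate.example.org"

def units_url(project, component, language):
    """Build the URL of the (paginated) unit list for one translation.

    The path follows Weblate's documented REST API
    (/api/translations/{project}/{component}/{language}/units/).
    """
    return f"{BASE}/api/translations/{project}/{component}/{language}/units/"

def fetch_units(project, component, language, token):
    """Fetch one page of translation units, authenticated by API token."""
    req = Request(
        units_url(project, component, language),
        headers={"Authorization": f"Token {token}"},
    )
    with urlopen(req) as resp:
        return json.load(resp)["results"]

print(units_url("kde", "dolphin", "nn"))
```

A prefetch worker would page through such results in the background and cache them locally for Lokalize to consume offline.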
In T11070#238105, @yurchor wrote: In T11070#238104, @dkazakov wrote: Hi, @yurchor!
Could you please explain your total opposition to having Weblate as another source of translations?
Because I've used it for several years.
I think that we do not need "another source of translations". With Weblate/Transifex/whatever you will soon find that the team coordinator is gone, or do not want to approve your translations, or you do not want to approve somebody's translations. The problem was not with tools (again). One more tool just makes the current life harder. Just look at this thread. All the problems can be solved with better team management.
You are constantly saying that "the performance will be much lower", but no one in this thread is talking about forbidding direct pushes into the svn/gitlab repository with translations.
No. Quite the opposite. They say that it is hard / should be cut off / should be left to the "power" users.
More than that, @sanecito has just confirmed that one can lock the file in Weblate and work with it in his/her own way. Why can't you just lock the file in Weblate and go translating it in Lokalize?
Why should I lock it first?
Just an example: today we have ~40 new strings in ~10 catalogs. I'd have to go and find each one among ~1000 KDE catalogs and lock it: ~5 minutes of my time gone for nothing. And I'd guess the overall probability that nobody else wants to translate the same thing is about 99.99%.
Another example: XX locked the catalog and is gone. Who will unlock it for an abandoned language?
These locks are no good for anybody except scripty or developers.
With such workflow Weblate will just work as a "translations coordinator", which distributes jobs among people. Where do I get it wrong?
Everywhere. First, most of the complaints are from people who do not want the distribution of jobs. They just want to change something and vanish.
Second, we are talking about teams with 2-3 active translators (most KDE teams). "Coordination" in such teams is rather simplistic: you just translate, translate, and translate. Every few months someone comes along with a new translation which you have to review. Once a year, you get a valid bug report about a mistake in a translation.
Almost all Weblate features are unusable in such a workflow. To be honest, you just want to get rid of any coordination and go the Launchpad/Rosetta way (or the Transifex way, like MATE or Deepin).
Excellent! Thank you for all the effort!
Dec 31 2020
I'm really sorry for the delay, and thanks for the help!
Nov 16 2020
Nov 15 2020
Nov 14 2020
ping?
Nov 5 2020
Oct 12 2020
I'm assuming we're waiting for Luigi to adapt makemessages and kick start the migration process. @ltoscano, is there anything I can help with to get the Git migration done sooner?
Sep 20 2020
Also, if there is just one author, maybe we could just commit the change using the proper authorship information.
I think we should remember to report all the authors in the commit messages when importing the content.
Sep 19 2020
kajongg only has German voices: https://invent.kde.org/games/kajongg/-/merge_requests/2
kdeedu-data, ktuberling and khangman are blocked by sr's magically shared CMake file cmake_modules/srDataMacros.cmake
You're right about the step files; I just removed them.
Sep 17 2020
Sep 13 2020
- autocorrect: we need to find (separate issue) where those files should live (probably the last leftover from the kdelibs split)
- step files are probably not needed anymore, as the object info messages are now handled through po files (see step_objinfo_files.pot)
Aug 26 2020
It wouldn't be difficult to make it translatable (and the infrastructure for translating Hugo and Jekyll websites exists now), but I would argue that the entire website could first use a small refresh in terms of content and design.
Still relevant? I see the link is broken.
Aug 24 2020
In T13519#238118, @huftis wrote: In T13519#238083, @ognarb wrote: Storing the translations in the repo is already done with the StaticMessage.sh scripts in various website repositories. Currently, StaticMessage.sh also post-processes the po files, but I wouldn't mind transforming the po files to markdown/yaml/... at deploy time in the binary factory instead of using scripty for that.
Yes, that sounds like the better solution. The PO files should really be considered as source files, and then it makes sense that any files generated from them (.mo/.xml/.yaml/.desktop files) should automatically be generated at compile time.
The desktop files would then be .desktop.in files, and intltool (intltool-merge) could be used to generate a .desktop file from the .desktop.in file and the set of .po files – just the way it’s done for normal (non-KDE) applications which store the PO files in the source repository.
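As a rough illustration of that merge step, here is a toy Python sketch of what intltool-merge (or gettext's `msgfmt --desktop`) does: fold PO translations back into a `.desktop` file as `Key[lang]=` entries. The file contents, the minimal PO parser, and the helper names are all invented for the example; real code would use the actual tools:

```python
import re

# A toy .desktop.in and one PO catalog, inlined for the example;
# real files would come from the repository.
desktop_in = "Name=Hangman\nComment=Guess the word\n"
po_nn = '''
msgid "Hangman"
msgstr "Hengtmann"

msgid "Guess the word"
msgstr "Gjett ordet"
'''

def parse_po(text):
    """Very small msgid/msgstr parser; enough for this sketch only."""
    pairs = re.findall(r'msgid "(.*)"\nmsgstr "(.*)"', text)
    return {msgid: msgstr for msgid, msgstr in pairs if msgid and msgstr}

def merge_desktop(desktop_src, catalogs):
    """Append Key[lang]=translation lines after each translatable key,
    roughly what intltool-merge does in desktop-style mode."""
    out = []
    for line in desktop_src.strip().splitlines():
        key, _, value = line.partition("=")
        out.append(line)
        for lang, messages in catalogs.items():
            if value in messages:
                out.append(f"{key}[{lang}]={messages[value]}")
    return "\n".join(out) + "\n"

merged = merge_desktop(desktop_in, {"nn": parse_po(po_nn)})
print(merged)
# Name=Hangman
# Name[nn]=Hengtmann
# Comment=Guess the word
# Comment[nn]=Gjett ordet
```

Running this at build time keeps the PO files as the single source of truth, which is the point huftis makes above.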
In T13519#238118, @huftis wrote: The desktop files would then be .desktop.in files, and intltool (intltool-merge) could be used to generate a .desktop file from the .desktop.in file and the set of .po files – just the way it’s done for normal (non-KDE) applications which store the PO files in the source repository.
In T13519#238083, @ognarb wrote: Storing the translations in the repo is already done with the StaticMessage.sh scripts in various website repositories. Currently, StaticMessage.sh also post-processes the po files, but I wouldn't mind transforming the po files to markdown/yaml/... at deploy time in the binary factory instead of using scripty for that.
In T11070#238111, @sanecito wrote: The problem was not with tools (again). One more tool just makes the current life harder. Just look at this thread. All the problems can be solved with better team management.
No, it can't. While technically there is a transfer process happening from SVN to Git, the problem of 'localizers should know SVN' is not being addressed. It's great that you and others put in the time to learn SVN, but telling linguists that no, they must first learn SVN to contribute string suggestions is utterly ridiculous and off-putting. It's like telling people they can't make a fire with a lighter and instead have to use a wood spindle, simply because you spent time learning how to use a wood spindle. There are better tools in the world; KDE does not benefit from being stuck in the past, using wood spindles over lighters. If you read the actual original ticket, the issue is not that coordinators have to sign off on translations, but rather that the process to get, write, and submit suggestions is absolutely painful.
People are getting incredibly stuck on approval being "have X votes". It doesn't have to be that way if it's an issue for people. You can keep the old process in Weblate: someone submits a suggestion via Weblate, and then a team coordinator signs off on it / "submits" it. Coordinators can easily find these either by setting up notifications for new suggestions on component X in language Y, or by selecting strings with suggestions in their language from the dashboard.
On a separate note unrelated to the tool, it's worth noting that at some point you have to make someone a team coordinator if there are no active ones for the language. Team coordinators of the past weren't magically given commit access; KDE granted it to them through some process, or lack thereof. Whatever process KDE used to establish, say, the first Italian or Russian team coordinators is the process you can use going forward when dealing with inactive coordinators or the lack thereof.