I suggest everyone (and I literally mean everyone) stops commenting on this ticket for 48 hours.
Aug 24 2020
In T11070#238107, @yurchor wrote:In T11070#238103, @zhigalin wrote:In T11070#238101, @yurchor wrote:I think there is some infantile way of thinking that we see here: "I do not want to send translations to the Russian ML or to the Russian coordinator (that's the requirement). I do not even want to know the team policies. I want to do it my way! The problems for other teams are acceptable to me!" That's how it sounds now.
Oh, wow.
I didn't want to seem rude to you, so I didn't write anything about it, but if you are speaking like this I might as well go on:
In all the messages you have written so far, including the one I am replying to, you have displayed an exceptional, infantile clinginess to the current system, to the point that you had to assign words a meaning unique to your own understanding, and resorted to cheap manipulations like selective blindness and putting words into your opponents' mouths. Oh, yes. The current system is awful (no doubt) because it uses Subversion, which you do not want to learn.
First, using SVN in 2020 is like using COBOL.
The only reason you would be willing to learn COBOL in 2020 is that you're doing it for a bank that pays you a ton of money.
Second, the problem is not the VCS used. If it were git, the issue would be pretty much the same.
The problem is not with the tools (again). One more tool just makes life harder. Just look at this thread. All of these problems can be solved with better team management.
In T11070#238054, @pshinjo wrote:In T11070#238043, @clel wrote:You might have some good points here. I understand that for an active team there is not much need for such a system. However, almost no language has all strings translated already when looking at https://l10n.kde.org/stats/gui/stable-kf5/team/. Quite the opposite, if you look at how many untranslated strings there are for many languages. So apparently almost no team is active enough to reach a completeness level of 100%. Thus I assume they lack the time and manpower.
Just to say, GNOME's l10n system is called "Damned Lies" (https://en.wikipedia.org/wiki/Lies,_damned_lies,_and_statistics). Yes, only a small number of teams reach 100% in the statistics. I personally do not aim for 100% there because:
- websites-kde-org/www_www.po contains ALL release notes dating back to the KDE 4.x era. Even if I translate 100% of them, who will actually read them all? I don't direct other translators to touch them either.
- Big specialized applications: I would name Krita, KStars, GCompris and RKWard as examples. They require specialized domain knowledge for usage (and correct translation). For some languages, even finding a user who uses the application in English (and is eventually willing to translate it) could be hard enough.
In T11070#238099, @zhigalin wrote:You and some others seem to be making the mistake of equating our code contribution barrier with our translation contribution barrier.
They are not the same.
A developer unable to use git and collaboration platforms like Phabricator and Gitlab is not a good developer.
Devs can pass the code contribution barrier by using their developer skills.
But for a translator, the usage of SVN (or another VCS) and the ability to handle PO files is not a part of their area of competence.
Sure, they can learn it, if they need it.
But as formerly said, they don't need it.
Because this is a purely volunteering part (unlike for devs who are getting experience and contributing to their CV).
The fundamental difference between the code barrier and the i18n barrier is that the skills necessary to overcome said barrier are inside the area of competence for devs but outside for translators,
so, if a dev can't handle it then he is not good enough to be allowed to contribute, but if a translator can't overcome the current barrier this does not mean he is not good enough.
In T11070#238106, @safaalfulaij wrote:Hello all.
I have been watching this for a while now, and I have to admit it: there are good points from both sides here.My take is this story:
Let's imagine a semi-inactive language team X. After a period of time, a new translator appears! Great.
They start translating using the great, nice-looking web interface, and start sharing it with friends and on social media.
Next day, 20 new translators appear! Great. They work randomly on whatever they feel like contributing to, and also share that with their friends.
Several days later, 40 new translators appear! Now all of the 60 + 1 translators start translating and voting for each other, and things are going just fine. After a period of time, a new user moves to KDE and finds the chess game. Time to play chess! While playing, he reads "Checkmark". Hmmm... what's that? Did they mean "Check"?
He opens a bug, and a developer seeks out the language team maintainer (that first translator). The translators investigate and find that 40+ people approved/voted for "Checkmark" as a translation of "Check" [mate] in a chess game. He reverts that, and a war starts. Where is the error? Sharing what you love? People voting on strings even though we set the criteria so high (10)?
This is how I see it: by using a Weblate-like interface, we set the barrier so low that anyone (those 60 weren't all translators, but people who understand both English and language X) can contribute without any mentoring/checking/guidance, etc.
In T11070#238103, @zhigalin wrote:In T11070#238101, @yurchor wrote:I think there is some infantile way of thinking that we see here: "I do not want to send translations to the Russian ML or to the Russian coordinator (that's the requirement). I do not even want to know the team policies. I want to do it my way! The problems for other teams are acceptable to me!" That's how it sounds now.
Oh, wow.
I didn't want to seem rude to you, so I didn't write anything about it, but if you are speaking like this I might as well go on:
In all the messages you have written so far, including the one I am replying to, you have displayed an exceptional, infantile clinginess to the current system, to the point that you had to assign words a meaning unique to your own understanding, and resorted to cheap manipulations like selective blindness and putting words into your opponents' mouths.
Hello all.
I have been watching this for a while now, and I have to admit it: there are good points from both sides here.
In T11070#238104, @dkazakov wrote:Hi, @yurchor!
Could you please explain your total opposition to having Weblate as another source of translations? You are constantly saying that "the performance will be much lower", but no one in this thread is talking about forbidding direct pushes into the svn/gitlab repository with translations. More than that, @sanecito has just confirmed that one can lock a file in Weblate and work with it in one's own way. Why can't you just lock the file in Weblate and go translate it in Lokalize?
In T11070#238101, @yurchor wrote:I think there is some infantile way of thinking that we see here: "I do not want to send translations to the Russian ML or to the Russian coordinator (that's the requirement). I do not even want to know the team policies. I want to do it my way! The problems for other teams are acceptable to me!" That's how it sounds now.
In T11070#238067, @dkazakov wrote:In T11070#237887, @slavekb wrote:Weblate allows downloading the MO file directly from the web interface. The user does not need any other tool to get the MO file.
Likewise, Weblate allows uploading a PO file if the user wants to use some offline translation tool. Hi, @slavekb!
Perhaps you know: does Weblate support some kind of "locking" of a translation file by the maintainer of the project? Such locking would really help people like @yurchor to do bulk translations offline.
I mean, we could make a policy that "a translator can ask to lock a specific file for not more than N days(?)" to do the translations offline safely and push them into SVN/Git. And the rest of the time the files would be available for translations by other people through web-interface.
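For illustration only, here is a minimal sketch of how such a locking policy could be driven from the command line, assuming the Weblate command-line client (wlc) is installed and configured with an API key; the kde/kstars component path is a made-up placeholder, and the exact subcommands and flags should be checked against the wlc documentation:
```
# lock the component so web users cannot edit it while offline work is in progress
wlc lock kde/kstars

# ... translate offline in Lokalize, commit to SVN/Git as usual ...

# unlock once the offline changes have been merged back into the repository Weblate tracks
wlc unlock kde/kstars
```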
In T11070#238099, @zhigalin wrote:In T11070#237992, @yurchor wrote:Speaking anecdotally, I contributed through such a system, fixing at least two small mistakes I came across and probably also translating some outstanding strings (judging from what I did in the past on other projects). Also, I think there are several more anecdotes like this around in the KDE community. Only they are hard to use as scientific justification.
O-o-h, yes! Can we lower the barriers for the code to have more anecdotes about KDE software?
Online translation tools can make access to translation easier (and not even in every case), but they definitely make the translation process much slower. Everything comes at a price.
I doubt that they make the translation process much slower in every case. Maybe in some corner cases that are more relevant for power users. That is why I think it is pretty important for those users to be able to keep their workflow, at least in a way that brings no substantial, or even noticeable, disadvantages.
They can show some Hawthorne effect (as can be seen with UserBase and WikiToLearn) in the short run, but they stagnate in the long run because of their inherent issues.
https://en.wikipedia.org/wiki/Hawthorne_effect
So if you just want to have more items in the team lists (most of them with ~0% contribution), then go ahead with the online tools. But be aware that translation coverage and quality will go down. Dixi.
You and some others seem to be making the mistake of equating our code contribution barrier with our translation contribution barrier.
They are not the same.
A developer unable to use git and collaboration platforms like Phabricator and Gitlab is not a good developer.
Devs can pass the code contribution barrier by using their developer skills.
But for a translator, the usage of SVN (or another VCS) and the ability to handle PO files is not a part of their area of competence.
Sure, they can learn it, if they need it.
But as formerly said, they don't need it.
Because this is a purely volunteering part (unlike for devs who are getting experience and contributing to their CV).
The fundamental difference between the code barrier and the i18n barrier is that the skills necessary to overcome said barrier are inside the area of competence for devs but outside for translators,
so, if a dev can't handle it then he is not good enough to be allowed to contribute, but if a translator can't overcome the current barrier this does not mean he is not good enough.
Ah, yes, an "occasional contribution" does not mean a "two strings contribution".
It means "not a regular contribution".
It can contain 2 strings as well as 200.
Hello, as one of the people interested in this issue I have been following this task for a while, and here I am to throw in my five cents:
In T11070#238091, @subins2000 wrote:
- Yes, the GNOME system has problems: it simplifies translator entry, but it is difficult for reviewers (who have other jobs) and might even discourage new contributors. We can do better.
In T11070#238095, @subins2000 wrote:
What outcome do you expect for small teams? More translators (to become the large translation teams)? More translations (please explain how it can be implemented without strong coordination)?
Just a quick note: we can't and won't ever enable any automatic acceptance of strings.
It would be like allowing automatic merging of code from people without commit permissions, and this is not allowed.
The final decision about the validity of a string must be made by someone with commit access.
In T11070#238091, @subins2000 wrote:Q: Who will most likely need it ?
A: Small teams, teams whose maintainer gets busy and leaves soon. These teams often bounce back after a while, but those who come will go through the same burden the old maintainer started with. Simple online tooling will help someone who is new and wants to change a few strings. Perhaps that contributor will stick around.
I have read every point here, and both sides have valid arguments. I want to clarify some points:
In T11070#238065, @dkazakov wrote:In T11070#238031, @kucharczyk wrote:[...] Those have nothing to do with the tool used unless the tool has major issues that drive people away,
That is exactly the problem we have at the moment. The current workflow drives new contributors away.
In T11070#238010, @yurchor wrote:In T11070#237998, @clel wrote:O-o-h, yes! Can we lower the barriers for the code to have more anecdotes about KDE software?
I don't get what you are trying to say here.
Everybody can register on GitLab and then use the online (web-based) editor to fix the code and make it better. Low enough, isn't it? So why not just use online coding? And where are all those new coders eager to improve the code who were put off by the high barriers of the old-fashioned previous systems?
You might not believe me, but I've seen a few people using GitLab's online editor to modify patches and make commits. It is useful in some circumstances.
And, yes, here in Krita we constantly strive to lower the barriers for code contributions: we make scripts for building Krita on all three major platforms, and we make building AppImages easier, so people can use their custom builds in their workflow without waiting for a release.
And we are making quite good progress, btw. We are getting more and more people proposing MRs on GitLab. There are a lot of new contributors proposing nice small changes.
From my point of view, using that for all the KDE repositories would make a lot of sense and would also help distros that provide nightly builds, like openSUSE Krypton, to package the app translations with their KDE packages. Another advantage is that developers can more easily test whether the translations are working in their app.
In T13519#238038, @pshinjo wrote:Periodic injection of PO files into source trees lets developers avoid cross-checking git and SVN when there is a bug related to a translation. Take Krita bug https://bugs.kde.org/show_bug.cgi?id=408481 as an example. It was not the Korean translation's fault, but if it had been, developers would not have had to check the Korean translation SVN tree and possibly bisect to the revision where the mistranslation was introduced.
In T11070#237887, @slavekb wrote:Weblate allows downloading the MO file directly from the web interface. The user does not need any other tool to get the MO file.
Likewise, Weblate allows uploading a PO file if the user wants to use some offline translation tool.
In T11070#238031, @kucharczyk wrote:[...] Those have nothing to do with the tool used unless the tool has major issues that drive people away,
Aug 23 2020
In T11070#238043, @clel wrote:You might have some good points here. I understand that for an active team there is not much need for such a system. However, almost no language has all strings translated already when looking at https://l10n.kde.org/stats/gui/stable-kf5/team/. Quite the opposite, if you look at how many untranslated strings there are for many languages. So apparently almost no team is active enough to reach a completeness level of 100%. Thus I assume they lack the time and manpower.
In T11070#238036, @aacid wrote:In that case I have to say that many users may just want to quickly correct some strings in the application they use.
This is the crux of the issue: you think random drive-by people translating 2 strings is a good thing; I am almost convinced it isn't.
In T11070#237987, @clel wrote:In that case I have to say that many users may just want to quickly correct some strings in the application they use. Having to get familiar with an offline workflow that is rather complicated, and too complex for that task, is a high barrier.
Periodic injection of PO files into source trees lets developers avoid cross-checking git and SVN when there is a bug related to a translation. Take Krita bug https://bugs.kde.org/show_bug.cgi?id=408481 as an example. It was not the Korean translation's fault, but if it had been, developers would not have had to check the Korean translation SVN tree and possibly bisect to the revision where the mistranslation was introduced.
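As a rough sketch of what such a periodic injection could look like (the paths, module names and commit message below are purely illustrative, not the actual KDE layout or the real scripty implementation):
```
#!/bin/sh
# hypothetical nightly job: copy the Korean Krita catalogs from the l10n checkout
# into the application's git repository, so git alone reflects the shipped translation
svn update ~/l10n-kf5/ko/messages/krita
cp ~/l10n-kf5/ko/messages/krita/*.po ~/src/krita/po/ko/
cd ~/src/krita || exit 1
git add po/ko
git commit -m "SVN_SILENT: sync Korean translations from l10n-kf5" && git push
```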
In that case I have to say that many users may just want to quickly correct some strings in the application they use.
Web-based translation tools make translations more accessible. They don't help you 1. get more contributors / keep the current ones or 2. maintain translation quality. Those have nothing to do with the tool used, unless the tool has major issues that drive people away, or the tool actively makes people translate worse for some reason.
@clel, the main point here is the focus on the tool, as if opening an online translation tool would attract more translators. Well, I really don't think so. And I have some examples.
I won't answer all your points this time, since I think it just won't make much sense. There is clearly some misunderstanding on your side about my motives. You seem to think I want to force you to use Weblate, which is not true. I also understand that using Weblate would slow down your workflow. However, you use a problematic discussion style to make that point. To me it seems there is some truth to your points, but it is pretty hard to get to, since it is sometimes hidden behind arguments that don't really make sense. This includes coming up with non-standard definitions of words like "translator".
In T11070#238001, @clel wrote:In T11070#237992, @yurchor wrote:No. And you know what I mean. Typing, for sure, is as fast as with offline tools. But switching between translations is slow. And it eats plenty of time if you type fast. It takes ~3-4 seconds just to switch between translations.
I just ran a test with the dev tools in Firefox. It showed 1.45 s for loading the page with the next translation after submitting the previous one. I admit that this is noticeable, but acceptable, and it does not really slow down the translation process much. I tested on https://hosted.weblate.org/translate/osmand/ios/de/. Let me know if you have a different site where the performance is worse. Also, if it really takes 3-4 seconds for you, it might have to do with your setup. I don't say that to blame you, but to try to understand. That amount is clearly a bad user experience that one should try to avoid.
In T11070#237998, @clel wrote:Fedora's translations are broken this way almost every week. The only saving grace is that they change very rarely.
The similar GNOME project (DL) is also *very* fragile (broken every month or so).
Are there any bug reports about this? Basically if what you write is true, those systems seem to handle conflicts rather poorly.
Just read the gnome-i18n@ list for the last several months. The complaints there cover just a third of the real cases when DL was offline.
https://mail.gnome.org/archives/gnome-i18n/2020-March/msg00120.html
https://mail.gnome.org/archives/gnome-i18n/2020-March/msg00097.html
https://mail.gnome.org/archives/gnome-i18n/2020-April/msg00045.html
https://mail.gnome.org/archives/gnome-i18n/2020-May/msg00055.html
https://mail.gnome.org/archives/gnome-i18n/2020-May/msg00038.html
https://mail.gnome.org/archives/gnome-i18n/2020-June/msg00041.html
https://mail.gnome.org/archives/gnome-i18n/2020-June/msg00020.html
https://mail.gnome.org/archives/gnome-i18n/2020-July/msg00019.html
https://mail.gnome.org/archives/gnome-i18n/2020-August/msg00030.html
Thanks, will do so. So there are no actual bug reports then?
In T11070#237992, @yurchor wrote:No. And you know what I mean. Typing, for sure, is as fast as with offline tools. But switching between translations is slow. And it eats plenty of time if you type fast. It takes ~3-4 seconds just to switch between translations.
In T11070#237992, @yurchor wrote:In T11070#237987, @clel wrote:@yurchor I move the conversation here, since it seems to have nothing to do with the old task's topic anymore.
In T13514#237943, @yurchor wrote:Fedora's translations are broken this way almost every week. The only saving grace is that they change very rarely.
The similar GNOME project (DL) is also *very* fragile (broken every month or so).
Are there any bug reports about this? Basically if what you write is true, those systems seem to handle conflicts rather poorly.
Just read the gnome-i18n@ list for the last several months. The complaints there cover just a third of the real cases when DL was offline.
In T11070#237992, @yurchor wrote:In T11070#237987, @clel wrote:@yurchor I move the conversation here, since it seems to have nothing to do with the old task's topic anymore.
In T13514#237943, @yurchor wrote:In T13514#237939, @clel wrote:In T13514#237910, @yurchor wrote:In T13514#237909, @clel wrote:Alright. Then I don't really understand what problems you have with Weblate. The things you wrote are too general for me to understand what the concrete problems are that you experience.
Sure. Just one thing that I do not understand is that people who do not really understand how the translation system works eagerly want to change that system.
When you write "Sure", I expect some more insights :) You wrote about problems you had but did not really give much detail about them. You talk about Weblate being much slower than offline tools while not mentioning which parts of the workflow you are talking about (admin stuff, translation itself, downloading and uploading PO files?).
All of them. There is no need for offline administration; translating every string through the web interface takes several times longer, even in zen mode; "downloading" new strings through Subversion and then finding what to translate in Lokalize takes ~5 seconds, while analyzing big projects (Fedora is smaller than KDE now) in Weblate takes minutes. Uploading big files (libguestfs and its man pages, libvirt, the Weblate docs, etc.) literally takes up to 10 minutes for just one file. I can imagine how long it would take to upload KStars, Krita and its docs (the last update required several dozen files to be uploaded; the translation itself contains several hundred files), RKWard, KMyMoney or LabPlot.
Interesting. I have only sporadically worked with online translation tools like Crowdin and Transifex, but I never noticed any slow behaviour when translating strings. I assumed it would just be similar in speed to any offline tool. You write several times slower; does that mean a factor of 3 or higher? That only seems plausible for very specific tasks, like perhaps searching for a wrongly translated string that one wants to correct.
No. And you know what I mean. Typing, for sure, is as fast as with offline tools. But switching between translations is slow. And it eats plenty of time if you type fast. It takes ~3-4 seconds just to switch between translations.
In T11070#237987, @clel wrote:@yurchor I move the conversation here, since it seems to have nothing to do with the old task's topic anymore.
In T13514#237943, @yurchor wrote:In T13514#237939, @clel wrote:In T13514#237910, @yurchor wrote:In T13514#237909, @clel wrote:Alright. Then I don't really understand what problems you have with Weblate. The things you wrote are too general for me to understand what the concrete problems are that you experience.
Sure. Just one thing that I do not understand is that people who do not really understand how the translation system works eagerly want to change that system.
When you write "Sure", I expect some more insights :) You wrote about problems you had but did not really give much detail about them. You talk about Weblate being much slower than offline tools while not mentioning which parts of the workflow you are talking about (admin stuff, translation itself, downloading and uploading PO files?).
All of them. There is no need for offline administration; translating every string through the web interface takes several times longer, even in zen mode; "downloading" new strings through Subversion and then finding what to translate in Lokalize takes ~5 seconds, while analyzing big projects (Fedora is smaller than KDE now) in Weblate takes minutes. Uploading big files (libguestfs and its man pages, libvirt, the Weblate docs, etc.) literally takes up to 10 minutes for just one file. I can imagine how long it would take to upload KStars, Krita and its docs (the last update required several dozen files to be uploaded; the translation itself contains several hundred files), RKWard, KMyMoney or LabPlot.
Interesting. I have only sporadically worked with online translation tools like Crowdin and Transifex, but I never noticed any slow behaviour when translating strings. I assumed it would just be similar in speed to any offline tool. You write several times slower; does that mean a factor of 3 or higher? That only seems plausible for very specific tasks, like perhaps searching for a wrongly translated string that one wants to correct.
@yurchor I move the conversation here, since it seems to have nothing to do with the old task's topic anymore.
Aug 22 2020
In T13514#237943, @yurchor wrote:In T13514#237939, @clel wrote:In T13514#237910, @yurchor wrote:In T13514#237909, @clel wrote:Alright. Then I don't really understand what problems you have with Weblate. The things you wrote are too general for me to understand what the concrete problems are that you experience.
Sure. Just one thing that I do not understand is that people who do not really understand how the translation system works eagerly want to change that system.
When you write "Sure", I expect some more insights :) You wrote about problems you had but did not really give much detail about them. You talk about Weblate being much slower than offline tools while not mentioning which parts of the workflow you are talking about (admin stuff, translation itself, downloading and uploading PO files?).
All of them. There is no need for offline administration; translating every string through the web interface takes several times longer, even in zen mode; "downloading" new strings through Subversion and then finding what to translate in Lokalize takes ~5 seconds, while analyzing big projects (Fedora is smaller than KDE now) in Weblate takes minutes. Uploading big files (libguestfs and its man pages, libvirt, the Weblate docs, etc.) literally takes up to 10 minutes for just one file. I can imagine how long it would take to upload KStars, Krita and its docs (the last update required several dozen files to be uploaded; the translation itself contains several hundred files), RKWard, KMyMoney or LabPlot.
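For readers unfamiliar with the offline workflow being described here, a simplified sketch of one iteration (the language and file paths are illustrative):
```
# update the team's translation checkout; new and changed strings arrive as untranslated/fuzzy entries
svn update ~/kde-l10n/uk/messages
# open the catalog to work on in an offline editor
lokalize ~/kde-l10n/uk/messages/kstars/kstars.po
# when done, commit the whole batch in one go
svn commit -m "Update Ukrainian translation of KStars" ~/kde-l10n/uk/messages/kstars/kstars.po
```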
In T13514#237941, @ltoscano wrote:I already wrote it: a central place is needed so that
Thanks for those points. As I said, if you already wrote that, I'd also have been happy with just a link to it.
In T13514#237939, @clel wrote:In T13514#237910, @yurchor wrote:In T13514#237909, @clel wrote:Alright. Then I don't really understand what problems you have with Weblate. The things you wrote are too general for me to understand what the concrete problems are that you experience.
Sure. Just one thing that I do not understand is that people who do not really understand how the translation system works eagerly want to change that system.
When you write "Sure", I expect some more insights :) You wrote about problems you had but did not really give much detail about them. You talk about Weblate being much slower than offline tools while not mentioning which parts of the workflow you are talking about (admin stuff, translation itself, downloading and uploading PO files?).
I already wrote it: a central place is needed so that
- people *NOT* using weblate don't have to check out tons of repositories to contribute. That's enough in itself.
- we may use posummit even with weblate to provide a single branch to everyone, which means that some logic to inject the translations into each branch will be needed somewhere else
- it will be the only interface that weblate would have to deal with (because in 5 years we may change tools again, and we don't want to lose the history)
- even in the case where part of the web tool would be the central place, that would still be the reference point, not the content of each repository, which would be a mirror once we solve T12268.
In T13514#237910, @yurchor wrote:In T13514#237909, @clel wrote:Alright. Then I don't really understand what problems you have with Weblate. The things you wrote are too general for me to understand what the concrete problems are that you experience.
Sure. Just one thing that I do not understand is that people who do not really understand how the translation system works eagerly want to change that system.
Note: any procedure that downloads anything at compile time is broken by design. The git repo for any project should be complete in itself and contain everything that is needed to build a binary releasable artefact. Every time anyone builds a project from a repo, the result should be the same as a build from a source release archive.
Aug 21 2020
Because they want to change it into something they might have an actual chance of understanding?
In T13514#237909, @clel wrote:Alright. Then I don't really understand what problems you have with Weblate. The things you wrote are too general for me to understand what the concrete problems are that you experience.
Alright. Then I don't really understand what problems you have with Weblate. The things you wrote are too general for me to understand what the concrete problems are that you experience.
In T13514#237900, @clel wrote:In T13514#237897, @yurchor wrote:Weblate (like any online translation system) is still deadly slow compared to offline tools. On Hosted Weblate and Fedora's Weblate I witnessed constant git merge conflicts every week which had to be resolved manually. This is a total nightmare for a number of repositories like KDE's. Actually, I will have to prioritize some KDE translations because it is unrealistic to work with all of them through the web interface (~15 minutes a day just to upload the translations, not to mention translating).
Weblate, Pootle, Wordbee, Transifex, Rosetta, etc. are all tailored either to contributing a few translations a week or to *huge* teams.
Thanks for the insight. Maybe we should continue that in T11070 or T13311 so as not to hijack this task too much. I don't know the reasons behind those merge conflicts, so I cannot really judge the tools on that. Are you aware that you can download PO files from Weblate, use them in your offline workflow and upload them again?
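A minimal sketch of that round trip, assuming the Weblate command-line client (wlc) and an illustrative kde/kstars/uk translation path; flags differ between wlc versions, so treat this as an outline rather than exact syntax:
```
# fetch the current PO file for offline editing
wlc download kde/kstars/uk -o kstars_uk.po
# ... edit kstars_uk.po in Lokalize or any other offline tool ...
# push the edited catalog back to Weblate
wlc upload kde/kstars/uk -i kstars_uk.po
```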
In T13514#237897, @yurchor wrote:Weblate (like any online translation system) is still deadly slow compared to offline tools. On Hosted Weblate and Fedora's Weblate I witnessed constant git merge conflicts every week which had to be resolved manually. This is a total nightmare for a number of repositories like KDE's. Actually, I will have to prioritize some KDE translations because it is unrealistic to work with all of them through the web interface (~15 minutes a day just to upload the translations, not to mention translating).
Weblate, Pootle, Wordbee, Transifex, Rosetta, etc. are all tailored either to contributing a few translations a week or to *huge* teams.
In T11070#237890, @aacid wrote:Maybe we should let actual translators define the requirements of what they need/want?
In T13514#237894, @clel wrote:In T13514#237857, @ltoscano wrote:Is there some other reference describing why this will make things "much harder for the translators"? Basically you would in fact have one central location for all files, but per project. Ideally those could be summarized in a different place, like Weblate maybe, so there is still a quick overview of all projects and languages needing attention, for example. Not sure whether that is supported by Weblate, though.
Not everyone will use weblate and we would have tons of repositories. That is not a viable path.
The number of repositories would not grow. The translations would just be incorporated into and managed in the existing repositories for each project. The number is also just as large as the already existing number of projects. Can you elaborate on what would speak against using Weblate to manage those repositories? Maybe you can do that specifically in the task about evaluating an online translation tool, to avoid duplication.
In T13311#237876, @aspotashev wrote:Shall we discuss how the system is going to handle multiple translation branches (e.g. trunk/l10n-kf5, branches/stable/l10n-kf5, ...) ?
Here are the obvious options:
- Use a web-based translation system that natively supports multiple "branches" for .po files. Does anyone know which of the listed systems have such support?
- Enable PO Summit for all teams, then wire the web system up to the .po files gathered (unified/merged) from all branches.
- ...anything else?
In T13514#237857, @ltoscano wrote:Is there some other reference describing why this will make things "much harder for the translators"? Basically you would in fact have one central location for all files, but per project. Ideally those could be summarized in a different place, like Weblate maybe, so there is still a quick overview of all projects and languages needing attention, for example. Not sure whether that is supported by Weblate, though.
Not everyone will use weblate and we would have tons of repositories. That is not a viable path.
In T13311#237876, @aspotashev wrote:Shall we discuss how the system is going to handle multiple translation branches (e.g. trunk/l10n-kf5, branches/stable/l10n-kf5, ...) ?
Here are the obvious options:
- Use a web-based translation system that natively supports multiple "branches" for .po files. Does anyone know which of the listed systems have such support?
- Enable PO Summit for all teams, then wire the web system up to the .po files gathered (unified/merged) from all branches.
- ...anything else?
In T11070#237886, @dkazakov wrote:In T11070#237855, @clel wrote:What modern tools are you talking about? Is this some online tool, some commandline tool or something else? Can you reference some example for such tool?
Well, I'm not a translator. I just wanted to add a requirement to the list.
In T11070#237886, @dkazakov wrote:Well, I'm not a translator. I just tried the first commercial tool Google suggested to me, and it supported compilation. I don't know if weblate supports that (perhaps it does). I just wanted to add a requirement to the list. I believe compilation of .mo files is almost a mandatory feature for the user. Installing gettext and compiling translation files on non-Linux systems is not an acceptable solution.
In T11070#237855, @clel wrote:
In T11070#237775, @dkazakov wrote:
- the user should be able to change the translation of one string, test it in the application, and send the result for integration/review within 15 minutes.
I also think there would be conflicts if somebody translates a .po file, uploads it directly through SVN, and at the same time translations for the same lines come from Weblate, correct?
Yes, there would.
Aug 20 2020
Shall we discuss how the system is going to handle multiple translation branches (e.g. trunk/l10n-kf5, branches/stable/l10n-kf5, ...) ?
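To make the PO Summit option mentioned above more concrete, here is a very rough sketch of the idea using plain gettext tools instead of pology's actual posummit (file names are illustrative): the branch catalogs are merged into one working catalog, and after translation each branch catalog is refilled from it via the compendium mechanism.
```
# gather: merge the trunk and stable catalogs into a single working ("summit") catalog
msgcat --use-first trunk/kstars.po stable/kstars.po -o summit/kstars.po

# ... translators work only on summit/kstars.po (for example through the web tool) ...

# scatter: regenerate each branch catalog from its own template,
# pulling translations from the summit catalog used as a compendium
msgmerge --compendium summit/kstars.po /dev/null trunk/templates/kstars.pot -o trunk/kstars.po
msgmerge --compendium summit/kstars.po /dev/null stable/templates/kstars.pot -o stable/kstars.po
```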
At the risk of being off-topic: are those conflicts also a big problem because of the way SVN handles merges, or would git be equally bad at it?
Is there some other reference describing why this will make things "much harder for the translators"? Basically you would in fact have one central location for all files, but per project. Ideally those could be summarized in a different place, like Weblate maybe, so there is still a quick overview of all projects and languages needing attention, for example. Not sure whether that is supported by Weblate, though.
In T13514#237756, @rempt wrote:Um, yes, it is? Why else come up with "another side of the argument"? In any case, problems with a practice we already support can hardly be relevant in discussing a practice we don't support yet?
Thanks for the insight. I also think there would be conflicts if somebody translates a .po file, uploads it directly through SVN, and at the same time translations for the same lines come from Weblate, correct?
In T4803#237771, @aspotashev wrote:@nalvarez, could you please update on the status of this ticket? Are you blocked on something? Do you need help?
Aug 19 2020
In T13514#237785, @dkazakov wrote:In T13514#237783, @ltoscano wrote:I know it may seem weird that I say this again, but this point is really out of scope for this specific task.
It is tracked by T13519. Well, this task talks about "injection" as a decided fact. But I was trying to understand why git and gitlab themselves cannot be used for that.
In T13514#237783, @ltoscano wrote:I know it may seem weird that I say this again, but this point is really out of scope for this specific task.
It is tracked by T13519
In T13514#237781, @dkazakov wrote:In T13514#237774, @pino wrote:Instead of suggesting solutions, please describe your requirements, or in general what you would like to see.
Well, there are two requirements:
- When releasing, setting a tag in the git repository should be enough to make a release. The tarball should be created automatically by gitlab's "releases" feature. Right now the scripts for making tarballs out of SVN break regularly, so every release we have to fix them in one way or another. And these scripts are not fool-proof. We have generated and published incorrect tarballs several times.
- The developers should have an easy way to build/install translations. Preferably, these translations should be synced with the current branch/commit (I often switch between master and krita/4.3 branches).
I've also added a workflow requirement to a different task as you asked, but got a weird reply to it: https://phabricator.kde.org/T11070#237775
In T13514#237774, @pino wrote:Instead of suggesting solutions, please describe your requirements, or in general what you would like to see.
In T11070#237779, @ltoscano wrote:Sorry for the previous comment. This is the task about the online tool. Your requirement about editing is satisfied. As for testing, we can create other tools to help with compiling/replacing the po file, but if the injection of po files is implemented (that one is a different task), that's going to be easy.
Sorry for the previous comment. This is the task about the online tool. Your requirement about editing is satisfied. As for testing, we can create other tools to help with compiling/replacing the po file, but if the injection of po files is implemented (that one is a different task), that's going to be easy.
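For the testing part, the manual steps today look roughly like this, sketched with gettext's msgfmt; the catalog name, language and install prefix are illustrative and depend on the application and the distribution:
```
# compile the edited catalog into the binary format the application loads
msgfmt -o kstars.mo kstars.po
# install it where the running application looks for translations (path varies per system/prefix)
sudo cp kstars.mo /usr/share/locale/uk/LC_MESSAGES/kstars.mo
# restart the application and check the changed string
kstars
```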
In T11070#237777, @ltoscano wrote:In T11070#237775, @dkazakov wrote:I think we should make a requirement for the updated workflow like that:
- the user should be able to change the translation of one string, test it in the application, and send the result for integration/review within 15 minutes.
This is not related to this task.
In T11070#237775, @dkazakov wrote:I think we should make a requirement for the updated workflow like that:
- the user should be able to change the translation of one string, test it in the application, and send the result for integration/review within 15 minutes.
I think we should make a requirement for the updated workflow like that:
In T13514#237772, @dkazakov wrote:Well, we can discuss the layout. I don't think that "one repo - one application" will result in much download volume for the application translators. We can also split it in a "one repo - one language for one application" manner. It might be too granular, but it might work and save time for people who work on all applications at the same time.
In T4803#237771, @aspotashev wrote:@nalvarez, could you please update on the status of this ticket? Are you blocked on something? Do you need help?
In T13514#237770, @pino wrote:
@nalvarez, could you please update on the status of this ticket? Are you blocked on something? Do you need help?
In T13514#237769, @dkazakov wrote:I don't really understand why we cannot use submodules for that. It looks like a submodule can track external branches. Why can't we just add a submodule to each KDE project that fetches translations from an external repo by tracking some specific branch?
I don't really understand why we cannot use submodules for that. It looks like a submodule can track external branches. Why can't we just add a submodule to each KDE project that fetches translations from an external repo by tracking some specific branch?
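For reference, this is what the submodule variant would look like mechanically; the repository URL and branch name below are placeholders, not an existing KDE repository:
```
# add a submodule that tracks a specific branch of an external translations repository
git submodule add -b master https://invent.kde.org/example/krita-translations.git po
git commit -m "Add translations submodule"
# later, pull in whatever that branch currently points to and record the new revision
git submodule update --init --remote po
git commit -m "Update translations submodule" po
```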
Boud, this is not the proper way to move this forward. Nothing in what Albert wrote is about not doing this.
In T13514#237738, @rempt wrote:No, it's not irrelevant. It doesn't matter that the .po files are only copied; if the translation in the stable branch and the one in the development branch diverge (which they eventually will), you'll get merge conflicts (when merging stable into development).
Yes, it is irrelevant. I've never seen anyone merge an entire stable branch in one merge commit into the unstable branch; normally, people cherry-pick patches. Just don't cherry-pick the po files.
In any case, the advantages of having a po folder in the project git repo outweigh all of that to me, so if you are so intent on blocking this with irrelevant arguments, well, I'll do it myself.
No, it's not irrelevant. It doesn't matter that the .po files are only copied; if the translation in the stable branch and the one in the development branch diverge (which they eventually will), you'll get merge conflicts (when merging stable into development).
In T13514#237735, @rempt wrote:Just to show the other side of the argument, I've had lots of developers asking for scripty not to commit the .desktop translations back to the repos, since it creates merge conflicts for them.
That is not "another side of the argument". It's irrelevant, because .desktop files are edited by developers; po files would only be copied in.
Just to show the other side of the argument, I've had lots of developers asking for scripty not to commit the .desktop translations back to the repos, since it creates merge conflicts for them.
In T13514#237706, @rempt wrote:In T13514#237638, @huftis wrote:This will make things easier for developers, packagers and users who like to compile applications themselves (instead of using packages, or just for testing a new feature or a bug fix).
And it follows the most common practice across the free software world, so there's familiarity for contributors, too.
But having a non-central location of the PO files will make things much harder for the translators (for various reasons that I won’t get into now).
May we have the best of both worlds? That is, have a central repository for translations, for use by the translators, and automatic copying of the translations into each application’s repository by scripty. Basically the same thing that happens with translations in .desktop files. They are translated centrally, but any updates to the .po files are merged into the .desktop files in each application’s repository by scripty each night. Having a similar thing be done with the .po files would be nice. Plain .po files would be copied directly, while other formats (e.g., .ts files used by some Qt applications) would be converted from the .po files.
And to avoid anyone manually editing the .po files in an application's repository, perhaps a pre-commit hook could be added so that only scripty is allowed to commit changes to the files.
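A git-flavoured variant of that idea would be a server-side hook rather than a pre-commit hook, since client-side hooks cannot be enforced. A rough sketch follows; GL_USERNAME is the variable GitLab exposes to server hooks, and "scripty" is an assumed account name, so this would have to be adapted to the actual hosting setup:
```
#!/bin/sh
# hypothetical pre-receive hook: only the "scripty" account may touch files under po/
zero=0000000000000000000000000000000000000000
while read old new ref; do
    # skip brand-new or deleted refs in this sketch
    [ "$old" = "$zero" ] && continue
    [ "$new" = "$zero" ] && continue
    if git diff --name-only "$old" "$new" -- po/ | grep -q . \
       && [ "$GL_USERNAME" != "scripty" ]; then
        echo "Only scripty may modify files under po/" >&2
        exit 1
    fi
done
exit 0
```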
Well, that would be a huge improvement for sure.
That GSoC proposal didn't go through. My initial suggestion was to replace the SVN localization with an online tool, which would mean straight commits to SVN would not be allowed. But with further communication, learning and setup, I realized that this has cons too. So I changed it from an entire replacement to a side-by-side system.
Thanks @subins2000 for adding a report of your experience and also some documentation. I added this information to the task description.
In T13514#237694, @woltherav wrote:Thanks! Maybe we should make this a subtask of that as well?
EDIT: especially because we're literally having duplicate conversations in both tasks :D