[ot][spam][crazy] coding therapy: maybe a bugfix?
i've become aware that some coding projects can be relaxing for me again. i was thinking of looking on savannah for something i could implement
this looks like it might be up my alley: http://savannah.nongnu.org/people/viewjob.php?group_id=11906&job_id=667 "edit perl script to add internationalization for welcome-a-bit for Wikinews" it's not part of GNU, but i'll ignore that for now. i remember perl a smidge.
notes on welcome-a-bit job
This project is designed to encourage involvement of people with newly submitted articles at Wikinews by sending them instant notifications of new submissions in their topics of interest.
Purpose: give encouragement to new submitters
Method: notify them of other new submissions in categories they have expressed interest in
To sign up, add your name to the page https://en.wikinews.org/wiki/User:Gryllida/welcome_a_bit . [visited this link:] To encourage timely completion of new submissions, you can help by 'welcoming' each new article 'a bit'. A welcome is:
- instant (or nearly instant)
- encouraging
- inviting to work together (and if they are asking for help, then keep this promise :) )
To offer your help with the code, click 'source code' below.
It's a user page with an interactive interface. It provides a "sign up button" that opts the visitor in for these notifications.
Signing up allows you to receive instant notifications of new draft submissions by email or on your personal talk page on wiki.
The notifications are of new draft submissions. They may be received either by e-mail or on your on-wiki page.
This tool is maintained by Gryllida, who does the coding and documentation. If there are any issues or suggestions, just ask at the talk page: Click here to leave a message.
There's a special link to communicate with Gryllida who is developing this. [the project is called "welcome-a-bit"]
Source code: https://savannah.nongnu.org/projects/weabit/
The development portal is https://savannah.nongnu.org/projects/weabit/ . Back to the job description. It said here "expected to be completed fully in February 2019". It may be stagnant. I'll ignore that for now. I just want a project.
Information for helpers: [https] User:Gryllida/welcome a bit/for helpers
Mastodon (the free federated Twitter alternative): [https] @gry@masthead.social
Back to the job description.
Currently we support delivery of notifications by e-mail. We may deliver the notifications to wiki pages, Android push notifications, or IRC in the future; if you are interested in any of these methods please send me a message.
Additional work is available. I'll focus on what's just in front of me.
This project is in the beta stage. You are encouraged to test it and share your feedback. To report any bugs or suggestions, please leave a message using any of the below venues:
- open an issue using the 'issues' link at the top
- query live chat at #wikinews at freenode
- leave a message on-wiki at https://en.wikinews.org/wiki/User_talk:Gryllida/welcome_a_bit
They have an irc channel on freenode (which may have moved to libera.chat). They use the #wikinews channel, so I can join it to discuss this project / job. Next in the description is a draft perl script.
review of job title: There is a perl script, and the goal is to edit it to add internationalization.

Notes on draft perl script:
> copyright line.
the author is an FSF member, celebrations
> GNU GPL license block
> # 1. read in a list of rss feeds and usernames
> # 2.
It has an incomplete functionality list containing 1 item. The item is to read in a list of rss feeds and usernames.
> use strict;
> ...
It imports a handful of perl packages, including:
- mediawiki API
- www client
- perl value dumper
"wabget" is commented out.
> my $mw = MediaWiki::API->new();
> my $wikiURL = 'https://en.wikinews.org'; # /w/api.php';
> $mw->{config}->{api_url} = "$wikiURL/w/api.php";
It's talking with wikinews via the mediawiki perl api. wikinews is a mediawiki site.
> while(1){
The rest of the script is inside an infinite loop. It runs like a daemon.
Skipping a laborious parse of the entire script for now; there is a job description below it.
For https://savannah.nongnu.org/projects/weabit/ we currently only support English Wikinews and the English language. It would be great if you could volunteer with adding internationalization in the text of the e-mails, and add support for other language editions of Wikinews - the relevant languages and category names are listed in the sidebar of en.wikinews.org. Please visit the home page linked in the first sentence to learn how to contact me and leave your feedback.
Okay, that doesn't sound too hard !!! I don't need to understand the whole script, I would just need to:
- add internationalization to the text of the emails
- add support for other language editions of wikinews [listed in sidebar of en.wikinews.org]
These are two separate tasks that will engage different parts of the script and its surrounding software ecosystems in different ways.
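One common perl shape for the first task is a per-language message catalogue: keep the e-mail strings in a hash keyed by language code, with sprintf placeholders, and fall back to English when a translation is missing. This is only a sketch under my own assumptions; the key names and strings below are mine, not taken from wabv2.pl or from how Gryllida has planned it.

```perl
use strict;
use warnings;

# Hypothetical message catalogue (keys and strings are illustrative,
# not from the real script).
my %messages = (
    en => { new_draft => 'A new draft, "%s", was submitted in %s.' },
    fr => { new_draft => 'Un nouveau brouillon, "%s", a ete soumis dans %s.' },
);

# Look up a message by language and key, falling back to English
# when the language or key is untranslated.
sub msg {
    my ($lang, $key, @args) = @_;
    my $template = $messages{$lang}{$key} // $messages{en}{$key};
    return sprintf($template, @args);
}

print msg('fr', 'new_draft', 'Exemple', 'Science'), "\n";
print msg('de', 'new_draft', 'Example', 'Science'), "\n";  # no 'de' entry: falls back to en
```

The nice property for this job is that translating a new language then means adding one hash entry, without touching the notification logic.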
Here's the sources repository: http://cvs.savannah.nongnu.org/viewvc/weabit/weabit/ I wonder; is one of these files equivalent to the draft published on the project page? Or what am I looking at?
yeah here it is: http://cvs.savannah.nongnu.org/viewvc/weabit/weabit/wabv2.pl?revision=1.2&view=markup

I want to see if i can make a cvs git remote to use git to manage changes. Maybe I can try this: https://github.com/rcls/crap

$ crap-clone :pserver:anonymous@cvs.savannah.nongnu.org:/sources/weabit weabit

seems to work. NOTE: the crap-clone readme says to use git2cvs instead. oops. I'll stick with what i did for now.

okay:
- add internationalization to emails
- add support for non-english wikinews hubs
- the file is wabv2.pl

thoughts:
- I think this perl script is the backend server. I'm imagining either multiple processes running in parallel, one per language, which might help isolate crashes a little when encountered but can make systems more confusing to troubleshoot, or a single process that iterates over all the languages.
- maybe it could be fun to procedurally retrieve all the languages

The languages are listed at https://www.wikidata.org/wiki/Q5296#sitelinks-wikinews . There are 34 of them. Pretty impressive !!!! There are languages in this list I don't recognise at all, as well as languages I do recognise as possibly being held with conflict in my culture. I think I'll try to retrieve the language list.
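The single-process option could be sketched roughly like this, assuming a %code_urls hash of language code to api url is available (the poll_language sub and the hard-coded hash are placeholders of mine, not code from wabv2.pl):

```perl
use strict;
use warnings;

# Stand-in for the language list; in practice this would be
# retrieved from wikidata rather than hard-coded.
my %code_urls = (
    en => 'https://en.wikinews.org/w/api.php',
    fr => 'https://fr.wikinews.org/w/api.php',
);

# Hypothetical placeholder for the per-language work the real daemon
# would do (fetch feeds, compare against known items, notify users).
sub poll_language {
    my ($lang, $api_url) = @_;
    return "checked $lang at $api_url";
}

# In the real daemon this loop would sit inside the while(1){} loop,
# visiting every language edition once per polling cycle.
for my $lang (sort keys %code_urls) {
    print poll_language($lang, $code_urls{$lang}), "\n";
}
```

The trade-off against one-process-per-language is the one noted above: a crash while polling one wiki stops them all, but there is only one process to watch.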
I learned a little about SPARQL, which is wikidata's query language. It has a quite pleasant setup, involving very few basic operators and high flexibility. You can query arbitrary data from wikidata, like who Bach's children are. This query returns the different wikinews language api urls: https://w.wiki/5$k3

SELECT ?result WHERE {
  # api-endpoint        instance-of  language-edition-of-wikinews
  [ wdt:P6269 ?result ] wdt:P31      wd:Q20671729.
}

:)
SELECT ?code ?url WHERE {
  # lang-code  api-endpoint                      instance-of  language-edition-of-wikinews
  [ wdt:P424 ?code; wdt:P6269 ?url ]             wdt:P31      wd:Q20671729.
}

let's see if I can get that into RDF::Query, a perl module that should be able to do this. I'm copying basic snippets from https://metacpan.org/release/VOJ/App-wdq-0.4.4/source/script/wdq to learn the use.

.. okay it turns out wikidata.org actually can _automatically generate perl code_. i can copy from its generated code. okay here's what I have:

use LWP::Simple;
use JSON;
use Data::Dumper;

# P424=language code; P6269=api url
# P31=instance of; Q20671729=wikinews language edition
my $query = <<EOF;
SELECT ?code ?url WHERE {
  [ wdt:P424 ?code; wdt:P6269 ?url ] wdt:P31 wd:Q20671729.
}
EOF

my $result_str = get "https://query.wikidata.org/sparql?format=json&query=${query}";
my $result_list = decode_json($result_str)->{results}{bindings};
my %code_urls = map { $_->{code}{value} => $_->{url}{value} } @$result_list;
print Dumper \%code_urls;

It outputs like this:

$VAR1 = {
          'uk' => 'https://uk.wikinews.org/w/api.php',
          'pt' => 'https://pt.wikinews.org/w/api.php',
          'tr' => 'https://tr.wikinews.org/w/api.php',
          'es' => 'https://es.wikinews.org/w/api.php',
          'ru' => 'https://ru.wikinews.org/w/api.php',
          'no' => 'https://no.wikinews.org/w/api.php',
          'fr' => 'https://fr.wikinews.org/w/api.php',
          'ro' => 'https://ro.wikinews.org/w/api.php',

yay ! i guess i need some error handling in there ... although the script to edit also had no error handling :S this seems really unfortunate, as errors happen reliably in my world. maybe i'll leave it out for now.
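For the record, a minimal sketch of what that error handling might look like, factored as a parsing helper so the checks are testable without the network. The parse_bindings name is mine, not part of the script; it assumes the same results.bindings JSON shape shown above.

```perl
use strict;
use warnings;
use JSON;

# Hypothetical helper (my name, not in wabv2.pl): turn the SPARQL
# response text into a code => url hash, dying with a clear message
# instead of warning about undef when anything is off.
sub parse_bindings {
    my ($json_text) = @_;
    die "empty response from query.wikidata.org\n"
        unless defined $json_text && length $json_text;
    my $data = eval { decode_json($json_text) };
    die "could not parse JSON: $@" if $@;
    my $bindings = $data->{results}{bindings}
        or die "response JSON has no results.bindings\n";
    return map { $_->{code}{value} => $_->{url}{value} } @$bindings;
}

# sample response in the shape the query above returns
my $sample = '{"results":{"bindings":[' .
    '{"code":{"value":"uk"},"url":{"value":"https://uk.wikinews.org/w/api.php"}}]}}';
my %code_urls = parse_bindings($sample);
print "$code_urls{uk}\n";
```

In the script above it would slot in as `my %code_urls = parse_bindings($result_str);`, replacing the bare decode_json line.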
Summaries:

# Task

Weabit, https://en.wikinews.org/wiki/User:Gryllida/welcome_a_bit , notifies people when pages are updated. The task, http://savannah.nongnu.org/people/viewjob.php?group_id=11906&job_id=667 , is to:
- add support for non-english wikinews
- add localization to email texts

# Starting Source

The script is under :pserver:anonymous@cvs.savannah.nongnu.org:/sources/weabit weabit . It's called wabv2.pl , and is a very simple daemon script with no error checking.

# New Code to enumerate languages

use LWP::Simple;
use JSON;
use Data::Dumper;

# P424=language code; P6269=api url
# P31=instance of; Q20671729=wikinews language edition
my $query = <<EOF;
SELECT ?code ?url WHERE {
  [ wdt:P424 ?code; wdt:P6269 ?url ] wdt:P31 wd:Q20671729.
}
EOF

my $result_str = get "https://query.wikidata.org/sparql?format=json&query=${query}";
my $result_list = decode_json($result_str)->{results}{bindings};
my %code_urls = map { $_->{code}{value} => $_->{url}{value} } @$result_list;
print Dumper \%code_urls;
participants (1)
-
Undescribed Horrific Abuse, One Victim & Survivor of Many