H&D website v2 working notes

From Hackers & Designers
Revision as of 22:49, 16 December 2020

MainNavigation No

Active Working Notes

wip how to structure notes

  • do we need dates? could we use article's changes timestamp and diff to get this?
  • maybe better organize around software "feature" / areas of work?

Recent changes stream

As part of the new website implementation, one idea is to send the Python frontend application a "message" whenever a change is made to the wiki. The Python app would then fetch the page that was modified and save it to disk as a static HTML file. All in all, the Python app should act as a bridge between the dynamic wiki content and the static output that we use to serve the final website.

A first idea was to make a user bot that would subscribe to every page in the wiki and follow its changes; once a page changed, a notification would be sent to the bot (unclear whether in an internal message format or through email). The Python app would then use this notification to fetch the latest version of the changed article and, lastly, produce a static version of that page.

A better, more fitting option, though, is to use the Recent changes stream API, which does just this: it listens for changes to any page and sends each change event to another server.

The $wgRCFeeds setting handles the MediaWiki side of this, and once the Python app runs as a server, we can use either UDP or Redis to subscribe to this stream.

Some work is needed to figure out how this must be set up, but it should be a fairly 'standard' procedure.
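The receiving end could be sketched roughly like this, assuming $wgRCFeeds is configured as a JSON-formatted UDP feed pointed at the Python app. The host, port, and the regeneration step are placeholders, not actual decisions:

```python
import json
import socket

def parse_event(data: bytes) -> str:
    """Extract the changed page's title from a JSON recent-changes event."""
    event = json.loads(data.decode("utf-8"))
    return event.get("title", "")

def serve(host: str = "127.0.0.1", port: int = 8192) -> None:
    """Listen for UDP datagrams from the wiki and react to changed pages."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _addr = sock.recvfrom(65536)
        title = parse_event(data)
        # Here the app would fetch `title` from the wiki and
        # write it out as a static HTML file.
        print(f"page changed: {title}")
```

The same parse_event function would work unchanged if we later switch from UDP to Redis, since the event payload format stays the same.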

H 2020-12-10: Great idea. Static site would also greatly speed up the site, so that's a nice bonus of this workflow. I would like to note that adding a page would potentially affect a lot of other pages through navigation etc. So it would probably be more effective to regenerate the whole site on changes (or at least when a new page is added). We'd have to implement some sort of debouncing to prevent doing that too often.

A 2020-12-10: Yes, good point. It would make sense to list when possible problems could arise and figure out how to fix them, either by a partial or a full rebuild.

H 2020-12-11: Just an idea, but one option could be to use JavaScript to progressively enhance the navigation. So each page has a baked-in navigation, but checks on load against a (JSON?) document whether the navigation has been updated since creation, and updates the navigation menus accordingly. If the page itself is eventually updated it will get the updated navigation as well, of course. You could have a script that recreates all pages daily (or at some other interval) at midnight if there are changes since the last run.

Web Analytics / GDPR

We are tracking users and not telling them in any way. This is partly because I (André) don't know enough about the GDPR, and partly because Matomo allows for anonymized data collection. Still, we should inform the user that we are doing this.

I would maybe just tell the user the truth: we are only tracking the number of visitors to the website for the purpose of sharing this figure in the end-of-year report we write for the funding institution.