H&D website v2 working notes

Revision as of 14:59, 10 December 2020

MainNavigation: No

WIP: how to structure these notes:

  • Do we need dates? Could we use the article's revision timestamps and diffs to get this?
  • Maybe better to organize around software "features" / areas of work?

2020-12-09

Recent changes stream

As part of the new website implementation, the idea is to send the Python frontend application a "message" whenever a change has been made to the wiki. The Python app would then fetch the page that has been modified and save it to disk as a static HTML file. All in all, the Python app should act as a bridge between the dynamic wiki content and the static output that we use to serve the final website.
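
A very rough sketch of what this bridge could look like on the Python side, using MediaWiki's parse API to fetch the rendered HTML of a single page. The wiki URL, the output directory and the function name are placeholders, nothing here is decided yet:

  # Sketch only: fetch one page's rendered HTML via the MediaWiki parse API
  # and write it to disk as a static file.
  import pathlib

  import requests

  WIKI_API = "https://wiki.example.org/api.php"   # placeholder API endpoint
  OUTPUT_DIR = pathlib.Path("static-site")        # placeholder output directory

  def render_page_to_disk(title: str) -> pathlib.Path:
      """Fetch the rendered HTML of one wiki page and save it as static HTML."""
      response = requests.get(WIKI_API, params={
          "action": "parse",          # parse API returns the rendered page HTML
          "page": title,
          "format": "json",
          "formatversion": "2",       # "text" comes back as a plain string
      })
      response.raise_for_status()
      html = response.json()["parse"]["text"]

      OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
      target = OUTPUT_DIR / (title.replace(" ", "_") + ".html")
      target.write_text(html, encoding="utf-8")
      return target

  if __name__ == "__main__":
      print(render_page_to_disk("H&D website v2 working notes"))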

A first idea was to make a user bot that would subscribe to every page in the wiki and follow their changes; once a page changed, a notification would be sent to the bot (unclear whether in an internal message format or through email). The Python app would then use this notification to fetch the latest version of the changed article and finally produce a static version of that page.

A better, more fitting option, though, is to use the Recent changes stream APIs (https://www.mediawiki.org/wiki/API:Recent_changes_stream), which do just this: listen for changes to any page and send that event to some other server.

The $wgRCFeeds setting (https://www.mediawiki.org/wiki/Manual:$wgRCFeeds) handles the setup for this, and once the Python app runs as a server, we can use either UDP or Redis to subscribe to this stream.
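
A minimal sketch of the listening side, assuming the wiki's LocalSettings.php points an RC feed with a JSON formatter at something like udp://127.0.0.1:1338. Host, port and the exact event fields are assumptions; a Redis variant would use a Redis client's pub/sub instead:

  # Sketch only: receive recent-change events over UDP, one JSON object per
  # datagram, and react to them. Port and field names are assumptions.
  import json
  import socket

  HOST, PORT = "127.0.0.1", 1338   # should match the uri configured in $wgRCFeeds

  def listen_for_changes() -> None:
      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      sock.bind((HOST, PORT))
      while True:
          datagram, _addr = sock.recvfrom(65535)
          event = json.loads(datagram)
          # The event carries at least the page title and the change type
          # (edit, new, log, ...); exact fields depend on the formatter.
          title = event.get("title")
          change_type = event.get("type")
          print(f"{change_type}: {title}")
          # Here the app could call render_page_to_disk(title) from the
          # sketch above, or schedule a full rebuild instead.

  if __name__ == "__main__":
      listen_for_changes()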

Some work needs to be done to figure out how this must be set up, but it should be a fairly 'standard' procedure.

H 10-12: Great idea. A static site would also greatly speed up the site, so that's a nice bonus of this workflow. I would like to note that adding a page would potentially affect a lot of other pages through navigation etc., so it would probably be more effective to regenerate the whole site on changes (or at least when a new page is added). We'd have to implement some sort of debouncing to prevent doing that too often.

A 2020-12-10: Yes, good point. It would make sense to list when possible problems could arise and figure out how to fix these, either by a partial or a full rebuild.
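
A rough debouncing sketch for the point raised above: collapse a burst of change events into one full rebuild after a quiet period. The delay value and the rebuild_site() callable are made up for illustration:

  # Sketch only: debounce rebuild triggers so a burst of edits causes a single
  # full regeneration of the static site.
  import threading

  DEBOUNCE_SECONDS = 30   # assumed quiet period before a rebuild actually runs

  class DebouncedRebuild:
      """Schedule a full rebuild, restarting the countdown on every new change."""

      def __init__(self, rebuild, delay=DEBOUNCE_SECONDS):
          self._rebuild = rebuild
          self._delay = delay
          self._timer = None
          self._lock = threading.Lock()

      def trigger(self):
          """Call this for every incoming change event."""
          with self._lock:
              if self._timer is not None:
                  self._timer.cancel()   # a new change resets the countdown
              self._timer = threading.Timer(self._delay, self._rebuild)
              self._timer.start()

  def rebuild_site():
      print("regenerating the whole static site...")

  if __name__ == "__main__":
      rebuilder = DebouncedRebuild(rebuild_site, delay=2)
      for _ in range(5):                 # five rapid changes -> one rebuild
          rebuilder.trigger()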