Seeing how people struggle with salvaging the old Ugha wiki and setting up a new wiki with MediaWiki, I have been thinking about alternatives. One such alternative could be federated wikis, a concept started by wiki inventor Ward Cunningham. Here is why, and how I started to get one running over I2P.

From the wiki on Ward Cunningham’s repo at GitHub:

I reported at the Indie Web Camp that one of my most dramatic mistakes was making wiki a centralized service.

So it’s not our fault, it’s his ;) No, but a pretty interesting statement. And yeah, centralised is exactly what we do not want. But what did he come up with instead? How could a Wikipedia be distributed? Well, he came up with a wiki system composed of federated nodes. You can visit such wikis at various addresses, but as I refuse to use Javascript from foreign servers, they never worked for me. So I thought running my own server would allow me to check things out a bit more closely … Installation is supposed to be done via npm, so I needed that first:

root@host:~# apt-get install npm
[...]
root@host:~# npm install -g wiki
[...]

The install went to /usr/local/lib/node_modules/wiki here, but it issued a whole lot of warnings, mostly about optional dependencies. Some of them had a reference to /usr/share/doc/nodejs/README.Debian appended, so I checked that:

Scripts calling Node.js as a shell command must be changed to instead use the “nodejs” command.

So Debian has some specials. I was able to work around this by changing the first line of the wiki’s entry script:

root@host:~# vim /usr/local/lib/node_modules/wiki/index.js
#!/usr/bin/env nodejs

I had meanwhile opened an issue about this on GitHub (link at bottom), where sudo ln -s /usr/bin/nodejs /usr/bin/node was pointed out as a system-wide workaround, which is probably the better option.

With the node running, I can now access it at localhost:3000. These wiki pages are built on Javascript, and at the moment not everything is readable without enabling it. This seems to depend on how a page is authored, though. At least on my local pages, where I can control what Javascript goes in, that can be acceptable. Still, having to activate Javascript just for viewing a site is something I usually consider bad design. I opened another issue on this topic on GitHub (see links at bottom).

Creating a server tunnel in the router that points to FedWiki’s port makes the wiki available in I2P, at least initially. I assume there will be things to fix in the background. FedWiki uses federation, i.e. it contacts other nodes, somehow; I haven’t quite found out how federation works yet. There might also be things that expose the local IP and the like, which I need to check.
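For reference, if the router in question is i2pd, such a server tunnel can be declared in tunnels.conf. This is a sketch only: the section name and keys filename are placeholders I chose, and the port assumes FedWiki’s default of 3000:

```
[fedwiki]
type = http
host = 127.0.0.1
port = 3000
keys = fedwiki-keys.dat
```

With the Java router the same thing is set up through the Hidden Services Manager in the console instead.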

Running a wiki, there are configuration options. Calling the program with --help provides the following overview:

user@host:~$ wiki -h
Usage: /usr/local/bin/wiki

Options:
  --url, -u         Important: Your server URL, used as Persona audience during verification                   
  --port, -p        Port                                                                                       
  --data, -d        location of flat file data                                                                 
  --root, -r        Application root folder                                                                    
  --farm, -f        Turn on the farm?                                                                          
  --home            The page to go to instead of index.html                                                    
  --host, -o        Host to accept connections on, falsy == any                                                
  --id              Set the location of the open id file                                                       
  --database        JSON object for database config                                                            
  --neighbors       comma separated list of neighbor sites to seed                                             
  --autoseed        Seed all sites in a farm to each other site in the farm.                                   
  --allowed         comma separated list of allowed host names for farm mode.                                  
  --uploadLimit     Set the upload size limit, limits the size page content items, and pages that can be forked
  --test            Set server to work with the rspec integration tests                                        
  --help, -h        Show this help info and exit                                                               
  --config, --conf  Optional config file.                                                                      
  --version, -v     Optional config file.           

So I need to supply the URL the wiki should use, like this:

user@host:~$ wiki --url http://mygeneratedb32tunneladdress.b32.i2p/ 
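Instead of passing everything on the command line, the same settings can go into a file handed over via --config. As far as I can tell the keys simply mirror the long option names; the values below are placeholders matching my setup:

```
{
  "url": "http://mygeneratedb32tunneladdress.b32.i2p",
  "port": 3000,
  "data": "/home/user/.wiki"
}
```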

I watched the console output and checked what the browser loads there. The browser tries to load 10 Javascript files, which initially got blocked. These include at least one external link, due to the supported login system - login.persona.org:

<script type="text/javascript" src="/js/jquery-1.11.3.min.js"></script>
<script type="text/javascript" src="/js/jquery-migrate-1.2.1.min.js"></script>
<script type="text/javascript" src="/js/jquery-ui/1.11.4/jquery-ui.min.js"></script>
<script type="text/javascript" src="/js/jquery.ui.touch-punch.min.js"></script>
<script type="text/javascript" src="/js/modernizr.custom.98077.js"></script>
<script type="text/javascript" src="/js/underscore-min.js"></script>
<script src="https://login.persona.org/include.js"></script>
<script type="text/javascript" src="/client.js"></script>

Another piece of script is contained in the body:

<script>
    var loggedInUser =
        null;
      var seedNeighbors =
</script>

I read about Reclaim Hosting doing some hosting on GitHub where anyone can create, or rather claim, a created wiki. It seems those then all show up in the corresponding GitHub repository. An interesting approach that might be worth a closer look, too; it could be adapted to I2P rather easily, if one does not object to the Javascript. Ward gave the hint to “tamper with the identity file” to locally mark a site as claimed and thus editable:

user@host:~$ echo "NoLoginAllowed:Tlkhtga8-qH.S#" > .wiki/status/open_id.identity

After that, I still see the following line during startup, same as before:

id: '/home/user/.wiki/status/persona.identity',

But with Javascript activated, the claim prompt is gone and I can edit the pages on the server. In this mode, however, everyone able to reach the server can freely edit the pages, which is not what I currently want, so I closed the I2P tunnel again for now. It seems these wikis won’t work without the identification service, which is currently based on Mozilla Persona. They used OpenID before, which is nicely supported under Debian by simpleid. But I see more Javascript coming with the login service, so from that perspective it looks like a rather bad choice. From all other perspectives it looks like a very cool project, at least at first glance. Will need to think about this some more ;)

But here are some more findings about how these wikis are supposed to work. Besides the servers, each client (i.e. the Javascript in the browser) stores pages when you edit them without being able to “write” to the server. A bit of a problem is how anyone would notice this, because to get these changes onto a server you would need to issue a “change request” manually. That sounds rather cumbersome for a first-time user, let’s say. I’m actually still not sure how this is done. In the discussion on GitHub it was explained to me that

The federation is built in the client, creating a neighbourhood as you navigate, rather than in the server. While loading each page the client scans it looking for references to other sites - these might be in the page’s history, where the page has been forked from, or using the reference plugin - the sitemap.json for each of these sites is loaded to create the local neighbourhood for the current session. It is this local neighbourhood that forms the area the client will look in when resolving links for a page.
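The scanning described there could look roughly like this. This is a sketch only, not the actual client code; the field names (journal, story, site, type) follow the FedWiki page JSON format as I understand it, so treat them as assumptions:

```javascript
// Sketch: collect the sites a page references, the way the client
// scans pages to build its neighbourhood.
function referencedSites(page) {
  const sites = new Set();
  // forks and remote edits record their origin site in the journal
  (page.journal || []).forEach(action => {
    if (action.site) sites.add(action.site);
  });
  // reference items in the story point directly at other sites
  (page.story || []).forEach(item => {
    if (item.type === 'reference' && item.site) sites.add(item.site);
  });
  return [...sites];
}

// For each site found, the client would then fetch its sitemap
// (e.g. http://<site>/system/sitemap.json) to seed the neighbourhood
// for the current session.
```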

So it seems a user with privileges on the server needs to a) notice these changes and then b) pull them in to “propagate” the changes. For keeping a copy of an I2P wiki updated, this at first does not sound like a good choice. On the other hand, if OpenID is used, the history included in each fork also preserves the identities, which addresses one major problem of any current forum/wiki. So basing a system solely on OpenID (which we can probably easily host ourselves) is maybe not a bad idea to start with. But it would need to be much easier for beginners to get started with contributing, and I doubt it is sensible to use Javascript in that way - especially in CryptNets, but actually rather in general.


ClearNet Links:

  • https://en.wikipedia.org/wiki/Ward_Cunningham
  • https://github.com/WardCunningham/Smallest-Federated-Wiki
  • https://github.com/fedwiki/wiki
  • https://www.npmjs.com/package/wiki
  • https://github.com/fedwiki/wiki/issues/61
  • https://github.com/fedwiki/wiki/issues/62
  • https://github.com/reclaimhosting/federated-wiki