DWiki's configuration file has a simple format. Blank lines and
comments (any line that has a '#' as the first non-whitespace
character) are just skipped, and everything else is interpreted as a
configuration directive to set. Directives can be continued with
additional lines by starting the continued lines with whitespace
(as in email headers). The continuation whitespace will be turned
into a single space in the final, un-continued version of the line.
Configuration directives have optional values, which are separated from the configuration item by whitespace. (Whitespace within the value is not interpreted, although trailing whitespace is removed from lines.)
So an example set of configuration file lines might be:
    root       /web/data/dwiki
    pagedir    pages
    tmpldir    templates
    wikiname   TestWiki
    wikititle  Testing Wiki
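A directive can also be continued onto following lines; for instance,
a (hypothetical) long wikititle could be written as:

    wikititle Testing Wiki for DWiki
        Configuration Experiments

which, with the continuation whitespace collapsed to a single space,
sets wikititle to 'Testing Wiki for DWiki Configuration Experiments'.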
DWiki requires and uses some configuration directives. Unused
configuration directives are not errors; all configuration directives
(and their values) become part of the context variables available for
template ${...} expansion.
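For example, with the configuration above a template could
(hypothetically) use the wikititle directive's value directly:

    <title>${wikititle}</title>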
To simplify life, configuration directives are put through a
canonicalization process. Among other things, if root is set but
pagedir, tmpldir, or rcsroot are not, DWiki checks whether directories
named pages, templates, or rcsroot exist under root and if so sets up
the configuration directives appropriately.

Required configuration directives are: pagedir, tmpldir, wikiname,
and rooturl. This means that with defaulting, the minimal DWiki
configuration file is:
    root      /some/where
    rooturl   /some/thing
    wikiname  SomeThing
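With this minimal configuration, the canonicalization means DWiki
looks for the remaining directories under root, along these lines:

    /some/where/pages        becomes pagedir
    /some/where/templates    becomes tmpldir
    /some/where/rcsroot      becomes rcsroot (if present)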
root
    The top-level directory for the DWiki's data; if pagedir, tmpldir,
    or rcsroot are not set, DWiki looks for them under here (see the
    canonicalization above).

pagedir
    The directory hierarchy that the DWiki's pages are served from.

tmpldir
    The directory that the DWiki's templates are found in.

usercs
    Whether or not usercs is set, DWiki refuses to serve files ending
    with ,v or in RCS directories; see InvalidPageNames. As a result,
    setting usercs is only necessary if you want page history et al
    to be visible to people visiting the DWiki; you can use RCS
    yourself on page files without setting it.

rcsroot
    Only meaningful if usercs is on. Normally the ,v files for pages
    live in RCS directories under pagedir, where basic RCS commands
    put them (if you make those directories; DWiki requires you to
    work this way). With this directive on, the RCS ,v files for
    files under pagedir are instead found under here, in a mirror of
    the directory structure in pagedir, so you have pagedir/foo/bar
    and rcsroot/foo/bar,v. This keeps pagedir neater at the expense
    of requiring some scripting support.
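For instance (hypothetical paths), a configuration that keeps ,v
files out of pagedir might use:

    usercs
    rcsroot  /web/data/dwiki/rcsroot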
wikiname
    The name of the DWiki (eg 'TestWiki' in the example above).

wikititle
    The title of the DWiki (eg 'Testing Wiki' in the example above).

wikiroot
    The page to use as the DWiki's front page. If this is not set,
    DWiki tries wikiname's value as a page name; if that doesn't
    work, people see the DWiki's root directory in a directory view.

rooturl
    The URL prefix that the DWiki's pages are served under; see the
    discussion of URL handling below.
publicurl
    If set, DWiki puts this instead of rooturl on the front of the
    URLs it generates for pages; see the discussion of URL handling
    below.

staticdir
    The directory that static files are served from.

staticurl
    The URL prefix that static files are served under. If staticurl
    doesn't start with a slash, it's taken as a subdirectory of
    rooturl. (Requires staticdir to be set.)

charset
cssurlprefix
    Used by the html/css template as one option for where to find
    DWiki's standard CSS file, dwiki.css. If this is set, it's the
    URL of a directory (without a trailing slash). If this is not
    set, the html/css template assumes that dwiki.css can be found at
    ${staticurl}/dwiki.css. It's more efficient to serve dwiki.css
    outside of DWiki itself, since it's a static file.

Note that various parts of DWikiText rendering do not look right if
the CSS is missing (in particular, all sorts of tables are likely to
look bad).
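For example (hypothetical paths), one way to serve dwiki.css and
other static files through DWiki is:

    staticdir  /web/data/dwiki/static
    staticurl  static

which, per the URL handling described below, makes the static files
visible under rooturl.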
When DWiki gets a request for a URL, it tries to turn it into a
request for something under either staticurl (if defined) or rooturl;
whatever is left after subtracting the appropriate prefix is the path
being served, relative to staticdir or pagedir.

staticurl is checked first, so it can be a subset of the URL space
available under rooturl.
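As a worked example (all values hypothetical), with:

    rooturl    /dwiki
    staticurl  /dwiki/static
    staticdir  /web/data/dwiki/static

a request for /dwiki/static/dwiki.css is served as the file dwiki.css
under staticdir, while a request for /dwiki/blog/entry is served as
the page blog/entry under pagedir.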
For safety reasons, DWiki only tries to process a request if the
request's URL falls under either staticurl or rooturl. If DWiki
receives a request for anything outside those two, something is
clearly wrong and it generates a terse error page.
When it generates URLs for DWiki pages, DWiki normally puts rooturl
on the front (as a directory). However, if you set publicurl, DWiki
puts that on the front instead.
This is useful if for internal reasons you receive requests with their URLs rewritten to something users shouldn't (or can't) use. The case ChrisSiebenmann knows is Apache with URL aliases and the DWiki CGI-BIN being run via suexec.
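For instance (hypothetical paths), if your web server internally
rewrites user-visible /dwiki URLs to a CGI-BIN path, you might use:

    rooturl    /cgi-bin/dwiki.cgi
    publicurl  /dwiki

so that DWiki resolves the rewritten request URLs it actually
receives but generates page links under /dwiki.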
See Authentication for more information on the authentication system.
authfile
    The file that the DWiki's users are defined in; see
    Authentication.

defaultuser
    The user that requests without any other authentication are
    treated as; the name must exist in authfile. This should be used
    carefully, as it makes all requests to the DWiki be authenticated
    (since they all have a user, even if only the default user). If
    this is set, the username it is set to is said to be the 'guest
    user'.

global-authseed

global-authseed-file
    Where DWiki reads global-authseed from, if it is set. The file
    has no special format, but should contain some randomness and its
    contents should be kept secret.

authcookie-path

logins-report-bad
commentsdir
    The directory that comments are stored under.

comments-on
    Turns on comments. This requires that commentsdir be defined and
    that authentication be enabled.

comments-in-normal

remap-normal-to-showcomments
    Gives you the effect of comments-in-normal without you needing to
    change the standard templates.

If you want to enable anonymous comments you should create a guest
user in the DWiki authfile and then set guest as the defaultuser.
(Well, you can use the username of your choice, but guest is
conventional.)
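A hypothetical comment-enabled configuration (all paths and names
here are illustrative) might add:

    commentsdir  /web/data/dwiki/comments
    authfile     /web/data/dwiki/users
    defaultuser  guest
    comments-on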
DWiki can optionally cache the results of page generation to speed up response time. See Caching for a longer discussion.
cachedir
    The directory that DWiki's caches live under (a separate
    directory from pagedir, tmpldir, or commentsdir). DWiki will
    write scratch files to here.

cache-warn-errors

render-cache
    Turns on the rendering cache. (Requires cachedir to be set.)

render-heuristic-ttl

render-anonymous-only

render-heuristic-flagged-ttl

render-heuristic-flagged-delta

bfc-cache-ttl
    Turns on the brute force cache (BFC) and controls how long
    entries stay in it. (Requires cachedir to be set.)

bfc-time-min

bfc-load-min

bfc-time-triv
    When checking bfc-load-min, don't bother looking at the load
    average if the page took at most this long to generate. Defaults
    to 0.09 of a second.

bfc-atom-ttl

bfc-atom-nocond-ttl

bfc-skip-robots
    A list of robots (in the same format as bad-robots, see later)
    that should not cause entries to be put into the BFC.

imc-cache-entries
    Turns on the in-memory cache and sets how many entries it holds;
    this is only useful when running dwiki-scgi.py as a preforking
    SCGI server.

imc-force-on

imc-cache-ttl
    How long entries stay valid in the in-memory cache; see also
    imc-cache-entries.

imc-resp-max-size

slow-requests-by
In practice some degree of caching is mandatory for decent performance
once your DWiki gets big enough and so it's recommended that you turn
on render-cache
and bfc-cache-ttl
unless you have a good reason to
do otherwise. Turn on imc-cache-entries
and imc-cache-ttl
if you're
using SCGI.
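For example, a cached configuration might include something like the
following (the values are illustrative only and assume the TTLs are
in seconds; the imc-* lines only matter if you're running
dwiki-scgi.py):

    cachedir           /web/data/dwiki/cache
    render-cache
    bfc-cache-ttl      60
    imc-cache-entries  50
    imc-cache-ttl      20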
atomfeed-display-howmany
    How many entries to put in Atom syndication feeds. If this is not
    set, atom::pages and atom::comments use a default of 100 items.

feed-max-size
    The size (in kilobytes) that atom::pages or atom::comments should
    try to limit their output to. If set, either stops adding new
    entries (regardless of how many entries have been processed
    already) once they have generated that many kilobytes or more of
    output. Because of the 'or more' clause, you should allow for a
    safety margin. If unset, syndication feeds are not size-limited.

feed-max-size-ips
    A list of IP addresses, IP address prefixes (eg '66.150.15.'), or
    IPv4 CIDRs (eg '66.150.15.0/25') that feed-max-size applies to.
    Syndication requests from any other addresses are not
    size-limited. If unset, feed-max-size applies to all syndication
    requests, regardless of what IP address makes the request. This
    option can be specified multiple times; if so, all the addresses
    are merged together.
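For example (a hypothetical size limit applied only to two
illustrative address ranges):

    feed-max-size      64
    feed-max-size-ips  192.0.2.
    feed-max-size-ips  198.51.100.0/25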
feed-start-time
    Sets a starting time for syndication feeds (see also
    atomfeed-tag). The value can be specified either as an integer
    Unix timestamp, as 'YEAR-MO-DA [HH:MM[:SS]]', 'YEAR/MO/DA', or an
    Atom format time string, and is always in local time (even when
    specified as an Atom format time string; sorry).

atomfeed-tag
    If this is set, the atom::pagetag renderer will use it to
    generate Atom <id>s for pages in the format <tag>:/<page path>.
    This should normally be set to a tag:-based URI; see here for a
    discussion.

atomfeed-tag-time
    If this is set, the atom::pagetag renderer will only generate
    tag-formatted Atom <id>s for pages more recent than this time.
    This can be used to make a graceful transition into tag-based
    Atom <id>s for an existing DWiki (and then, with feed-start-time,
    to gracefully move it). This has the same time format as
    feed-start-time.
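For example (hypothetical tag URI and dates):

    atomfeed-tag       tag:wiki.example.org,2008
    atomfeed-tag-time  2008-06-01
    feed-start-time    2009-01-01 00:00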
atomfeed-virt-only-adv
    Restricts the types of virtual directories that DWiki advertises
    Atom syndication feeds for. The vdir types are latest, oldest,
    range, calendar, and the calendar subtypes year, month, and day.

atomfeed-virt-only-in
    Restricts the types of virtual directories that DWiki will serve
    Atom syndication feeds for. A disallowed latest or range feed
    request is (permanently) redirected to the real directory's feed;
    other disallowed feeds get 404 responses. The format and list of
    vdir types is the same as for atomfeed-virt-only-adv. If
    atomfeed-virt-only-adv is set, it becomes atomfeed-virt-only-in's
    default value. If both are set, this should be a superset of
    atomfeed-virt-only-adv's value; otherwise DWiki will advertise
    feeds that it will refuse requests for.
You should normally allow feeds for latest
because this gives
people a way of controlling how large a feed they pull from you;
they can use, eg, 'blog/latest/10/?atom' to pull only a ten-entry
feed instead of your full-sized feed.
These two directives don't change or affect what Atom comment feeds are advertised or allowed; they affect only Atom feeds for pages.
alias-path
    See Aliases.

search-on

blog-display-howmany
    How many entries the blog::blog renderer should try to restrict
    most pages it displays to. If set, it must be a positive integer;
    if not set, blog::blog uses a default.
canon-hosts
    A list of canonical hostnames for the DWiki. If DWiki gets a
    request with a Host: header that is not in this list, DWiki
    immediately serves up a redirection to the first hostname in the
    list (or canon-host-url, if that is set), which is assumed to be
    the preferred hostname.

canon-host-url
    The canonical URL for the DWiki (without a trailing /, but
    including http or https and the port if necessary). DWiki will
    generate redirects and absolute URLs that use this URL. If
    canon-hosts is also set, this should be the full version of the
    first entry in canon-hosts. (This is primarily useful in some
    hopefully unusual situations involving HTTP-to-HTTPS transitions.)
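For example (a hypothetical hostname):

    canon-hosts     wiki.example.org
    canon-host-url  https://wiki.example.org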
literal-words
    A list of words, separated by ' | ' (space, |, space), that will
    be rendered literally and not considered to contain markup, as if
    each of them had been specified in '.pn lit <whatever>'
    processing note directives.
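For example, to keep a couple of (hypothetical) words from being
treated as containing DWikiText markup:

    literal-words  __init__ | foo_bar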
dump-req-times
    Equivalent to the -T command line option.

dump-atom-reqs
    Equivalent to the -A command line option.

stamp-messages
    Equivalent to the --stamp command line option.

These are documented because you might want to set them directly if
you're running DWiki as a WSGI application inside some standard WSGI
server (such as uWSGI, Apache's mod_wsgi, or Gunicorn).
bad-robots
    A list of robots, separated by ' | ' (space, |, space), that
    should get permission denied responses when they try to fetch
    pages in various views that no robot should be fetching.
    Currently the list of bad views is atom, atomcomments, source,
    and writecomment, all of which are typically fetched by robots
    that don't respect rel="nofollow" on links.

no-ua-is-bad-robot
banned-robots
    A list (in the same format as bad-robots) of robots that should
    get permission denied responses on all requests.

banned-ips
    A list of addresses (in the same format as feed-max-size-ips)
    that will get access denied responses for all requests. It can be
    specified multiple times.

banned-comment-ips
    Like banned-ips but only applies to attempts to write comments.

bad-robot-ips
    Like banned-ips but only applies to requests that try to fetch
    pages in various views that no robot should be fetching (as in
    bad-robots).
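For example, a hypothetical set of these directives (the robot names
and addresses are purely illustrative) might look like:

    bad-robots  ExampleBot | OtherCrawler
    banned-ips  192.0.2.
    banned-ips  198.51.100.0/24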
Under normal circumstances it's more efficient to use your web
server's access controls to totally ban IP addresses and bad
user-agents; your web server usually has faster code for this and you
don't have to get DWiki involved in the process. banned-robots and
banned-ips exist because this is not always possible.