It seems like they actually support it in Wiki.js however it requires you to first click "insert assets" and once that modal displays [0], you can actually paste into the page and it will be uploaded.
Not too different from JIRA, really. But I feel like this feature would be improved if pasting into the editor itself yielded the same result.
"You can upload images from a tab in the media dialog, or by dragging and dropping a file into the editor, or by pasting an image from your clipboard. [...] The image will be inserted into the page when you are done."
That said, this is about to change! The upcoming Mediawiki 1.35 is supposed to move Parsoid into core PHP, and so VisualEditor is going to become a lot more default-accessible. :D
I have been waiting for an online wiki with the usability of Apple Notes that I use locally on all my Apple devices. It works like a charm except that I cannot make it public.
This is why I often take notes rather than write blog posts on my website. If wiki software were as easy and drag-and-drop as Apple Notes, I'd just take notes and they'd turn into publicly available wikis!
I have yet to find that tool. I would happily pay for such a tool, with one deal-breaking condition: it must be self-hostable. I will not write my content into something like Medium or Notion where I don't own my content.
https://www.dokuwiki.org/plugin:imgpaste https://github.com/cosmocode/dokuwiki-plugin-imgpaste
Is this the type of functionality you're looking for if it could be self-hosted? How much would you be willing to pay for such a tool?
- UI: Joplin.
- AGPL, fully self-hostable.
- You own your content (because Joplin).
- Choose among free templates, or create your own.
- Templates will be similar to or compatible with Hugo; still TBD.
Optional for paying customers:
- Sync via WebDAV to my service.
- Custom domain.
- Backups, etc.
Once this starts generating money, I am planning to spend some of it to fund e2e per-folder encryption in joplin.
* CKEditor: https://extensions.xwiki.org/xwiki/bin/view/Extension/CKEdit...
* Syntax: https://www.xwiki.org/xwiki/bin/view/Documentation/UserGuide...
Actually any wiki pages can define a Class and how to display this class and / or instances of a Class:
* https://www.xwiki.org/xwiki/bin/view/Documentation/DevGuide/...
* https://extensions.xwiki.org/xwiki/bin/view/Extension/App%20...
The Script Macro is useful to make some dashboards ( https://extensions.xwiki.org/xwiki/bin/view/Extension/Script... )
I've deployed this for the internal documentation inside a company I worked for (MediaWiki was a no-go even with a visual editor).
For each new feature, I developed inside a clean new wiki, then exported the changes once I was sure everything was okay. That makes it much easier to upgrade to a new XWiki version.
From the text of Outline's license [1]:
Notice
The Business Source License (this document, or the “License”) is not an Open
Source license. However, the Licensed Work will eventually be made available
under an Open Source License, as stated in this License.
I think it is more accurate to say that Outline will be open source, rather than that Outline is Open Source.

[1]: https://raw.githubusercontent.com/outline/outline/develop/LI...
Since I just got into MediaWiki and wrote my first extension (finally a dark mode that works), I'll see if this can be implemented. Perhaps with https://www.mediawiki.org/wiki/API:Upload.
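To sketch how that might look (hedged: this assumes MediaWiki's action API with `action=upload`; the paste-handler wiring, filename scheme, and token plumbing here are illustrative, not an existing extension):

```javascript
// Sketch of pasting an image and pushing it through MediaWiki's action=upload API.
// A CSRF token must first be fetched via action=query&meta=tokens (not shown here).
function buildUploadForm(filename, fileData, csrfToken) {
  const form = new FormData();
  form.append('action', 'upload');
  form.append('format', 'json');
  form.append('filename', filename);
  form.append('token', csrfToken);
  form.append('file', new Blob([fileData]), filename);
  return form;
}

// Hypothetical paste handler: grab the first image on the clipboard and upload it.
async function onPaste(event, apiUrl, csrfToken) {
  for (const item of event.clipboardData.items) {
    if (!item.type.startsWith('image/')) continue;
    const form = buildUploadForm(`pasted-${Date.now()}.png`, item.getAsFile(), csrfToken);
    const res = await fetch(apiUrl, { method: 'POST', body: form, credentials: 'include' });
    return res.json();
  }
}
```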
MediaWiki has some UX and RBAC challenges that make it difficult to scale to large organizations.
Google has some motivations written down from their lawyer department: https://opensource.google/docs/using/agpl-policy/
It boils down to 'not worth the risk, do not use'.
Simply hosting it with your information in it would not have any such effect, but many commercial entities avoid anything *GPL just in case. In this case perhaps because they see a time when they might later want to package and distribute documentation that is in the wiki without converting it to something else first.
There is an extra concern with AGPL that does not exist with GPL specifically because of its key difference. AGPL applies to hosting the software and making it available not just distributing a compiled form. Some interpret this as meaning that if it is hosted on the same server, or in the same site, as other software then that other software becomes AGPL licensed too. I doubt anyone would enforce this interpretation but the possibility is enough to put off those who create proprietary software.
> making it accessible to the public that the copyleft license would then apply to my proprietary software?
Not just the public. Anyone you give access to, so for non-public hosted proprietary software you could be obliged to give them access to the code under the AGPL in situations where the AGPL applies. This will be a complete blocker for many creators of proprietary, or other non-*GPL licensed, software.
[if the above makes me sound against AGPL rest assured that I am not - I in fact might end up using it at least initially (at least until I decide upon which of the more proprietary-friendly options to use) for some near-future projects]
Interestingly, Amazon just finished a multi-year effort to migrate off MediaWiki internally to comply with an infosec mandate banning PHP company-wide.
To actually set someone else's password to a specific value does require running a command-line script (not the same as going into the db). In my view that is a reasonable security-convenience trade-off.
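For reference, the script in question is maintenance/changePassword.php; a typical invocation looks something like this (user and password are placeholders):

```shell
# Run from the MediaWiki installation root; requires shell access to the server.
php maintenance/changePassword.php --user="SomeUser" --password="NewSecretPassword"
```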
In any case, i would assume a large org would use a single-sign-on extension and not mediawiki's native user management, which would make MW's password management moot.
Disclaimer: am mediawiki developer
- easy to use for technical and non-technical staff alike: multiple editing options
- third party authentication: really comprehensive offering
- quality search: comprehensive internal and third party search offering
- ease of maintenance: largely everything is built-in, so no module/dependency maintenance headaches
- user management: solid user/group management system
With internal tools you need things to stick, and fast. As much as I am fond of mediawiki, the editing experience is a barrier to usage for many. And the extension ecosystem, while rich and diverse, is just more of a liability than a single installation. A quality search is also really important to adoption, so having options there is great.
I'd been using Docsify on a small scale with authentication through GitLab to edit, GitLab CD to build and Cloudflare Access to secure the front end. It works really well, but the lack of user management and the editing experience mean that it's time to move on.
It would be great to hear if this is a case of the grass always being greener on the other side.
Who exactly is asking for slower software?
I'm sorry but it's right there in the name
As much as we could argue about whether no-JS support really matters in 2020, the fact remains that having to load Vue.js and have it parse and render the frontend on the client is not really "lightweight", especially when the most popular competing products pre-render on the backend.
It wouldn't honestly be that much of an issue if it was a SPA (and it must do SPA-ish things already if it uses Apollo) so you just load the frontend once and it'd load other pages asynchronously, but nope. Every link is a full page reload, with Vue having to re-do everything every time.
It just reeks of modern tech used in an old fashioned way, which ends up with the performance penalties from both.
Go browse any other wiki, see how much faster and smoother the experience is.
I would say using js without having a no-js version is ok, if done correctly.
Also noticed that it feels slow page to page. Thinking it might be an issue with my Firefox configuration, I opened the page in a fresh profile and it's still slow; but if you open it in Chrome, page-to-page navigation becomes almost instant and is more comparable to MediaWiki. So maybe this particular performance issue on FF can be resolved, but it does seem like a worse end-user experience compared to MediaWiki.
But yes, not loading without Javascript is a showstopper.
Is Node.js that blazing fast?
It is based on V8 which in many (caveat: far from all) benchmarks comes out as the fastest JS engine, so by that definition in the realm of JS powered components it is pretty quick.
node.js solutions will often perform better than common configurations of other options too - Apache+PHP to pick one example out of the air. Then again, depending on the code other configurations of PHP might outperform Node.
This gives us plain text files that are tracked in a repo. It uses the user as the author, so now I can "code review" edits to our wiki.
The content of the wiki is easily cloned by cloning the git repo. It is markdown in folders, so if Wiki.js dies at some point I could write a pandoc script to turn it into web pages again, though you do lose all of the cool UI features.
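That fallback script could be as small as this (a sketch, assuming pandoc is installed and the pages use the .md extension):

```shell
# Walk the cloned repo and render each Markdown page to standalone HTML next to it.
find . -name '*.md' -exec sh -c 'pandoc -s "$1" -o "${1%.md}.html"' _ {} \;
```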
Now that git has become ubiquitous, I prefer git with a self-hosted git-daemon instance. git, grep, awk, and sqlite make a strong set of tools for knowledge curation.
edit: minor grammar fix
That said, DokuWiki is pretty decent to get up and running quickly.
I understand that bad code can be expressed in every language. But there is bad tooling too.
PHP clearly has a lucrative place in the world. But it remains a significant threat vector.
Yes, even in 202x. I leave others to discuss why this is the case. I won't install PHP on a workstation just to run a Wiki. ^_^
With automation you'd build images based on their images but run via your own CI/CD with your own security scans and any additions you might need (like additional logging infrastructure). Doing that is not possible with AGPL.
I guess, to a certain extent, that's because I'm an individual, not a company, and one that tends to open source pretty much everything they write. This is the same licensing that I use for pretty much all my projects (AGPL with no CLA).
What are you talking about? They can change the license to a closed one from a certain version in the future.
You're right if and only if by "they" you mean every copyright holder whose contributions would exist in the future version (including, say, the contributions of the very person you're responding to). But if by "they" you mean the project leaders acting without the cooperation of everyone who holds copyright, then that's a no.
None of that is impossible with AGPL.
So what? If companies need a certain software, they can pay for it. I remember a time when FOSS was not about providing companies with free work, quite the opposite indeed.
This isn't about good/bad or something like that, just an odd presentation that doesn't seem to be in line with the license. There is nobody to pay here to use this stuff because you still won't be able to integrate it without also sharing internal IP.
There are plenty of organisations that would happily pay what they'd normally pay Atlassian to use Wiki.js, but they can't because they don't want to share any of their own code. This is also why license guides like the one from Google explicitly ban all AGPL software: it's not worth the risk.
It's a bit weird to comment on this as if it's an oversight or unintended downside. Suppose you keep going into someone's house and they don't want you to, so they do something to dissuade you (like putting locks on their doors). You then complain that you can't get in. Their likely response? "Well, yeah..."
I mean, that's too large a number. Is this across all open source software, or am I misunderstanding something else?
It's just enough added structure and functionality to make the whole body of notes more useful, without having to learn a formal system or adopt someone else's idea of what my note hierarchy should look like.
You can host your own for free: https://github.com/outline/outline
Node.js 10.12 or later
MySQL, MariaDB, PostgreSQL, MSSQL or SQLite3
Is it possible to install and run all of these as a non-root user?

    docker run -d -p 8080:3000 --name wiki --restart unless-stopped -e "DB_TYPE=postgres" -e "DB_HOST=db" -e "DB_PORT=5432" -e "DB_USER=wikijs" -e "DB_PASS=wikijsrocks" -e "DB_NAME=wiki" requarks/wiki:2

And choose SQLite.
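For the SQLite route specifically, something like this should work (hedged: DB_TYPE=sqlite and DB_FILEPATH are Wiki.js 2's documented variables, but the volume path and --user value here are assumptions; check whether the image runs cleanly as a non-root UID):

```shell
# Wiki.js 2 with SQLite: no separate DB container, run under an unprivileged UID.
docker run -d -p 8080:3000 --name wiki --restart unless-stopped \
  --user 1000:1000 \
  -e "DB_TYPE=sqlite" -e "DB_FILEPATH=/data/wiki.sqlite" \
  -v wiki-data:/data \
  requarks/wiki:2
```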
Everyone should consider running a wiki locally just for yourself. It's like being able to organize your brain. I just got into it two days ago and basically spent the whole weekend dumping things into it in a way I can actually browse and revisit, like the short stories I'd written, spread out across Notes.app and random folders.
You don't need to run WAMP, MySQL, Apache, phpmyadmin or anything. Here are the steps for someone, like me, who hadn't checked in a while:
0. `$ brew install php` (or equiv for your OS)
1. Download the wiki folder and `cd` into it
2. `$ php -S localhost:3000`
3. Visit http://localhost:3000/install.php in your browser
I tried DokuWiki at first (it has a flat-file DB, which is cool). It's simpler, but I ended up going with MediaWiki, which is more powerful; aside from Wikipedia using it, I noticed most big wikis I use also use it (https://en.uesp.net/wiki/Main_Page). MediaWiki lets you choose SQLite as an option, so I have one big wiki/ folder sitting in my Dropbox folder symlinked into my iCloud folder and local fs.
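For anyone curious what the SQLite choice looks like in config: the installer ends up writing a couple of lines like these to LocalSettings.php (paths and DB name here are examples, not this person's actual setup):

```php
# Relevant LocalSettings.php lines for MediaWiki on SQLite (example values).
$wgDBtype = "sqlite";
$wgDBname = "my_wiki";
# Directory holding the .sqlite file; point it at a synced folder if you like.
$wgSQLiteDataDir = "/Users/me/Dropbox/wiki/data";
```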
Really changing my life right now. The problem with most apps is that they just become append-only dumping grounds where your only organizational power is to, what, create yet another tag?
My advice is to just look for the text files scattered around your computer and note-taking apps and move them into wiki pages. As you make progress, you will notice natural categories/namespaces emerging.
I just wish I started 10 years ago.
My point is to see how easy it is to set up (I used to always equate PHP with having to get a whole WAMP stack online) thus how easy it is to try for yourself.
or you can just use Zim, a cross-platform desktop app which does not need any setup and simply saves files as plain text in markdown: https://zim-wiki.org
This is the rub. I started a tiddlywiki last year, and stuck with it for several months, but now it has fallen to the wayside as too cumbersome.
I've been in a number of firms with wiki knowledge systems. In 100% of the cases it was a wasteland of derelict knowledge that had been abandoned and was usually much more destructive than beneficial.
No one was going to undertake the process of keeping it up to date, and at the same time the emergent organization/structure of information was constantly evolving, and wikis are terrible at evolving with that unless you literally have people whose sole job is making templates deciding on the ontology, etc.
Similarly, countless people have tried to organize their lives into tools like wiki. And in the early days it seems magical. I suspect the failure rate would be somewhere barely under 100% at the one month mark.
It's like you're about to tell me that exercising doesn't pay off because it's hard to stick with a strategy. "Heh, let's see if he's still doing pushups in a year."
You don't seem to realize you're just describing literally all systems. How organized is everyone's filesystem and ~/Documents folder? It's pure chaos with the only sweet release being that you might not carry it over when you upgrade computers and get to start from scratch.
Will I be maintaining my localhost wiki in a year? I don't know. But it's worth a shot. After two days it's already 1000x more organized than even my best efforts so far.
Is it for everyone? Nothing is for everyone.
But your comment seems to suggest that you think the alternative to <organization strategy> is organized data which obviously isn't the case.
What you will realize is that there is no perfect one-size-fits-all strategy. All you can do is try things and see if they work for you, and see if you stick with them years later.
So, for today, I recommend trying some localhost wiki options in your battle against chaos. If it doesn't work for you, so what?
I've thought about knowledge management a lot over the last 20 years, since I built a Wiki/bug tracker system (this was before anything except Bugzilla existed).
I think knowledge management systems can work if the "management" side is a side effect of their use.
Today, though, with Google Docs, Keep, Notes, GitHub Gists, GitHub itself, and many other places where I can easily store notes and access them from anywhere, there's no reason to set up a wiki and have to maintain it myself.
When you start to plan how to move all of your stuff under one umbrella, the solution starts to sound a lot more like a wiki on paper, I think. Even if you move all this stuff to your filesystem, I think you still need a layer over it to manage it all -- or at least I did.
Of course, it's not the only answer. And I admit I have been contributing to wikis like Wikipedia and UESP for a decade now and the jump to a personal wiki was a no brainer.
But I wonder, what solution would you consider for this "disjointed data" problem? Do you just not see it as a problem? One of the first things I did when I stood up a personal wiki was to log into ancient google accounts to exfiltrate ancient google docs that I'm glad I found again.
It just became an append-only log for me with very limited organizational power. Though I do like it for anything just long enough where a single .txt file doesn't cut it. Tiddly is great for that case because it encapsulates the common task of jumping between the same sections over and over -- the real downside of a large file. But you aren't alone in finding it's not so great on a larger scale.
So if you did like the idea of a wiki but weren't diddly with the Tiddly, might be worthwhile to check out something like DokuWiki or MediaWiki.
The reason I'm a convert is that it seems like the best of both worlds between raw note-taking and a wiki. The advantage over raw note-taking is the links that enable you to "crawl" its entirety. The advantage over a wiki is that it's tech-agnostic and you can do it however works best for you.
On the other hand, a wiki may be better for someone who wanted to embed media in their notes (such as audio recordings).
I created a mashup of Zettelkasten + bullet journaling + a linking system based on tagging and IDs that models the fact that knowledge is both hierarchical and associative - i.e. fractal.
My co-founder and I have been building a hosted version of this[1] for the last two years, because we recognized that while self-hosted wikis work great for techie people, there are a lot of other people that label doesn't fit.
So we've been working to create a collaborative knowledge-base platform built around some key concepts:
1. Built around cards rather than documents, which allows for a lot of interesting and flexible features. Such as...
2. Granular sharing – on Supernotes, you can share an entire collection of cards, or you can share one card at a time. We also have recently introduced[2] a "friends" feature that allows you to quickly drag-and-drop cards onto your friends to share with them.
3. Multi-parent nesting – there is no folder-style filesystem on Supernotes, we allow you to nest cards inside of each other. On top of this, we allow for this nesting to be multi-parent, so different users can fit the same cards into their own unique structure (effectively a collaborative / personalized version of symbolic links).
4. Public vs. private tags – cards can be tagged with public tags that everyone sees, but can also be tagged privately with only tags that you can see. This same idea is reflected across the platform, where we want the underlying content to be the same for everyone but want to allow users to personalize the metadata/structure to suit their own workflow.
5. Focus on speed – we have spent a lot of time making Supernotes speedy quick, and try to make it faster every time we release a new feature.
Anyway that is the rough idea. The goal of Supernotes is to be a sort of data-layer where you can keep all these compartmentalized pieces of content (as cards) and then mix-and-match at will to create very simple or very complex stores of knowledge. We also want you to be able to embed these pieces of content elsewhere (say in a Notion document or on your blog) with as little effort as possible (not quite there yet, but will be soon).
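For the technically curious, point 3 essentially means cards form a DAG rather than a tree. A toy sketch (all names here are illustrative, nothing to do with the actual Supernotes implementation):

```javascript
// Toy model of multi-parent nesting: a card keeps a set of parent ids,
// so the same card can sit in different hierarchies for different users.
class Card {
  constructor(id, content) {
    this.id = id;
    this.content = content;
    this.parents = new Set();
  }
}

function nest(child, parent) {
  child.parents.add(parent.id);
}

const recipe = new Card('c1', 'Pasta alla gricia');
const cooking = new Card('c2', 'Cooking');
const italyTrip = new Card('c3', 'Italy trip');
nest(recipe, cooking);
nest(recipe, italyTrip); // the same card under two parents, like a symlink
```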
I did a scan of the FAQ and ended up on the docs and searched for export. I was pleasantly impressed with the entry, which showed an export option along with text and videos showing how to do this.
For me, I probably won't be spending 300 on this when I can run a wiki or WordPress for free... but if I was not so jaded about SaaS and the cloud, I would be persuaded to check out your thing if you had something on the front page like "export, backup", and bonus if it was "import/export markdown or similar files".
I'd feel less worried about vendor lockin, holding data hostage, what happens when you go bankrupt, etc
The heading fonts on your privacy page are a little wonky in my browser (Firefox). Given that you are in the UK and sharing data with the EU and outside the EU, I'd only save info if it was encrypted... not sure if that is a thing; if so, I would make "privacy built in" a big thing on the front page.
My two cents in trying to help. I'm sure 98% of those who may use your service are not as sensitive to the same things I am, so this is not a critique saying it's bad, just some random thoughts as I took a look.
Data ownership is pretty important to us, even though we are only offering a hosted solution, which is why we explicitly say as much in our T&Cs[1]. But yep, we want to make export / backup of your data as easy as humanly possible. The hard part generally is that there are a number of features that exist on Supernotes which just don't exist elsewhere, so even when you use the export feature it is hard to guarantee we can export it in a format that is useful to you.
That is part of the reason we are doing our best to openly document our API[2] so that you can interact with your own content in whatever way you wish (including importing content from wherever or exporting to wherever). Obviously this requires some coding, but we're hoping the community[3] will share any tools they build on top of the API with each other.
Unfortunately E2EE is not quite there yet, as it makes it much more difficult to facilitate sharing, as well as being a bit of a problem for a knowledge base if a user loses their private keys and you have to tell them "sorry, we can't get your content back – it's all gone". But this is definitely something we are working towards – it just takes some time to nail the UX. Since we are definitely never going to sell your data or anything (as per the T&Cs), it's better for us if it's E2EE, as then it's one less liability from a data protection perspective.
[1] https://supernotes.app/terms/
EDIT: VisualEditor, the de facto standard for pasting things like screenshots into your articles, seems to be a pain to install. Got my local env up and running though. Will report back on success with this extension.
Given the complexity of setting up MediaWiki properly, I think I'm going to keep using Obsidian.
If I ever tire of keeping a personal wiki for whatever reason, all of the content I've built up in it will remain organized as files within directories.
I highly recommend the 'Backlinks' plugin to improve the wiki functionality; leaves Roam standing in the dust for personal use.
However, backlinks are not possible without hacks. A wiki without backlinks is kind of lame and I could very well use my good old plain text files.
Have you run in to trouble when updating MediaWiki, or is it smooth sailing? SQLite is not mentioned here: https://www.mediawiki.org/wiki/Download
- future proof (at least not only a one-man project)
- fast search over all information
- fast creation of quick notes (inbox)
- mobile iOS client
Currently I am stuck with Notion, which has a great 'database' concept. Which is fun to use. Sadly it's too slow. If I want to take a quick note on the go "Google for M6x40 Screws" I need 10-20 seconds with Notion.
I don't even mind paying for such service...
I wish proper wikis hadn't gone by the wayside. (I think it has a lot to do with MediaWiki's default skin being out of style, and people not realizing they can change it.) Most of all, I wish open source projects would stop dumping a bunch of Markdown in a repo somewhere and calling it a "wiki". They're not even close to comparable.
My two biggest complaints about MediaWiki are 1. PHP, and 2. no well-supported way to opt in to a different syntax like Markdown or AsciiDoc or pretty much anything that isn't MediaWiki-flavored wikitext.
My steps work for all 5 of the wikis I tried before I settled on MediaWiki (though I don't necessarily recommend it to everyone). The install.php script might be in some subfolder, but the website instructions will tell you.
Neither DokuWiki nor MediaWiki (via sqlite) needed to have an external DB running, though some wikis do depend on MySQL.
It was just a quick summary to show how easy it is. e.g. PHP has an embedded server these days.
<a href="words">[words]</a>

meaning out-of-the-box support for arbitrary hrefs:

[/absolute_wiki_links]
[relative_wiki_links]
[https://external.links]
[mailto:email@adress.es]

and more!

If using a web app, it would be better to run it on a $5 server, so if you want to type in something while you're outside with just your phone, you can do that also.
I have thousands of notes in Notes.app across every subject. And moving them into my wiki (categorizing them, linking them) was one of the first things I did. And one of the best things I've done. I had all sorts of stuff in there: stories I've written, lists of things, texts my father sent me, 4 different documents where I had written down birthday/xmas ideas for my girlfriend that I never remembered to check.
These mapped very nicely to pages and categories on my wiki. I even have a page for my girlfriend (globally available on my sidebar) that now has a === Gift ideas === subheader.
One day you just might decide Notes.app is not cutting it for you and that you want better organization. Maybe you won't. I'm in my 30s and didn't do it til now.
I have mine running from Dropbox, so my other computers always have it synced. The real issue is mobile access. It's not something I care about right now but making it internet addressable is certainly something I could do in the future.
We run this in a docker container with SQLite database and backup the database daily to another server.
The private and public pages feature fits perfectly to our use case. We show system information, how-to guides and rules on the public pages and manage sysadmin documentation with restricted access.
The main thing I'm worried about with other wiki software (including Wiki.js) is whether it's compatible with gadgets, userscripts and all of the other neat tools already available.
It doesn't have to be MediaWiki, or even a distant relative of it. It just has to work with them.
I will be happy if this Wiki.js platform does have compatibility with these features, though.
It is unclear to me if you refer to statistics or gut feeling here. Would you mind clarifying?
Oh wait yeah.