A meta-communities wiki tool — a living knowledge commons where any kind of data is imported, curated collaboratively, and exported as interoperable files for tools like Tana, Logseq, and beyond. Wikipedia meets a living forest.
Global Wiki Tree is a meta-communities wiki tool that reimagines how knowledge is collectively owned, curated, and distributed. Where Wikipedia created a single encyclopedia, Global Wiki Tree creates a living forest of interconnected knowledge bases — each community tending its own branch while sharing roots with the whole.
The core premise is simple: communities and organizations already produce vast amounts of structured knowledge, but it remains trapped in silos. Global Wiki Tree gives these communities a shared trunk — a common data layer with strong confidence defaults — while letting each branch define its own curation standards, taxonomies, and export formats.
Think of it as the knowledge commons that the internet was always supposed to become: not a single platform that owns everything, but a living tree where every node is sovereign, every connection is meaningful, and the whole grows stronger as each part grows.
Two forces are converging. First, knowledge management remains fragmented — your notes, your team's wiki, your community's documentation all live in separate tools that barely talk to each other. Second, the rise of AI has made trustworthy, curated data more valuable than ever. Models trained on messy, unverified web scrapes produce messy, unverified outputs. Global Wiki Tree addresses both: structured knowledge that flows freely between tools, with provenance and confidence baked in.
Global Wiki Tree follows a three-phase flow: any kind of data enters the system, gets shaped through community curation, and exits as clean, portable files ready for whatever tool you prefer. A code sketch of the flow follows the three stages below.
Import: any kind of data enters the tree, whether structured, semi-structured, or raw, via APIs, file uploads, or scrapers.
Curate: community editors shape, verify, and connect knowledge. The easiest path: curation by domain experts.
Export: clean, interoperable files for Tana, Logseq, Obsidian, or any other tool. Also available via API.
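To make the flow concrete, here is a minimal TypeScript sketch of the three phases. Every type, function, and field name is illustrative rather than the actual Global Wiki Tree schema, the confidence arithmetic is an assumption, and the Logseq output is simplified to a plain outline.

```ts
// Illustrative sketch of import -> curate -> export.
// All names and the confidence math are assumptions, not the real schema.

type Provenance = { source: string; importedAt: string; curatedBy: string[] };

type Claim = { text: string; confidence: number; provenance: Provenance };

type TreeNode = { id: string; title: string; claims: Claim[]; links: string[] };

// Phase 1: import. Normalize a raw record into a node; unverified imports start low.
function importRecord(raw: { title: string; body: string; source: string }): TreeNode {
  return {
    id: crypto.randomUUID(),
    title: raw.title,
    claims: [{
      text: raw.body,
      confidence: 0.3,
      provenance: { source: raw.source, importedAt: new Date().toISOString(), curatedBy: [] },
    }],
    links: [],
  };
}

// Phase 2: curate. Each verification by a new curator raises a claim's confidence.
function verify(node: TreeNode, curator: string): TreeNode {
  return {
    ...node,
    claims: node.claims.map(c =>
      c.provenance.curatedBy.includes(curator)
        ? c // the same curator cannot verify twice
        : {
            ...c,
            confidence: Math.min(1, c.confidence + 0.2),
            provenance: { ...c.provenance, curatedBy: [...c.provenance.curatedBy, curator] },
          }
    ),
  };
}

// Phase 3: export. Render a node as a Logseq-style markdown outline.
function toLogseqMarkdown(node: TreeNode): string {
  const lines = [`- ${node.title}`];
  for (const c of node.claims) lines.push(`  - ${c.text}`);
  for (const l of node.links) lines.push(`  - related:: [[${l}]]`);
  return lines.join("\n");
}
```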
Not all knowledge carries the same weight. Global Wiki Tree defaults to showing only strong-confidence content — claims that have been verified by multiple curators or backed by cited sources. This is the public face: what any visitor sees.
For organizations and communities running their own branches, the confidence threshold is adjustable. An internal research team might want to see everything, including tentative connections and unverified imports. A public knowledge base for a city government might want only the most rigorously sourced material. The slider is theirs to set.
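A minimal sketch of that adjustable threshold, assuming a numeric confidence score per claim; the 0-to-1 scale and the default value are assumptions for illustration:

```ts
// Adjustable confidence threshold (the 0..1 scale and defaults are assumptions).

type Claim = { text: string; confidence: number; citations: string[] };

// Public default: only strong-confidence claims, i.e. multi-curator verified or well cited.
const PUBLIC_DEFAULT = 0.8;

function visibleClaims(claims: Claim[], threshold: number = PUBLIC_DEFAULT): Claim[] {
  return claims.filter(c => c.confidence >= threshold);
}

// An internal research team sees everything, tentative connections included:
//   visibleClaims(claims, 0);
// A city government's public knowledge base keeps only the most rigorously sourced:
//   visibleClaims(claims, 0.95);
```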
Global Wiki Tree exposes a full API for programmatic access. Import data from any source, query the knowledge graph, export curated subsets, and integrate with existing workflows. The API is the backbone that makes the tree a living system rather than a static archive.
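A hypothetical usage sketch of the three API surfaces named above (import, query, export). The base URL, endpoint paths, and payload shapes are all assumptions for illustration, not documented endpoints.

```ts
// Hypothetical client calls; paths and payloads are illustrative only.

const BASE = "https://api.globalwikitree.example/v1";

// Import data from any source into a branch.
async function importRecords(branch: string, records: object[]) {
  const res = await fetch(`${BASE}/branches/${branch}/import`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ records }),
  });
  return res.json();
}

// Query the knowledge graph with a confidence floor.
async function queryGraph(branch: string, q: string, minConfidence = 0.8) {
  const params = new URLSearchParams({ q, minConfidence: String(minConfidence) });
  const res = await fetch(`${BASE}/branches/${branch}/query?${params}`);
  return res.json();
}

// Export a curated subset as files for a target tool.
async function exportSubset(branch: string, format: "tana" | "logseq" | "obsidian") {
  const res = await fetch(`${BASE}/branches/${branch}/export?format=${format}`);
  return res.blob(); // e.g. an archive of markdown files
}
```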
Global Wiki Tree sustains itself through the organizations and communities that use it to publish and manage knowledge — not by extracting value from individual users. The commons stays free; the tooling for professional use carries the cost.
Communities that want to publish structured knowledge get the full platform for free or near-free. The tree grows by having more public data from more communities.
Organizations that use the tree for internal knowledge management or to publish professional-grade datasets pay for advanced features.
How does the tree attract lots of public data? By serving those who already want to publish: communities, NGOs, open-knowledge organizations, civic institutions. They bring the data; the tree gives it structure, discoverability, and interoperability.
Global Wiki Tree is not just a product — it needs to become a tech movement, similar to how Wikipedia became a movement before it became a platform. The technology is the easy part. The hard part is building the culture of collaborative knowledge curation at scale.
To reach critical mass, the project requires six key players who together form the initial root system of the tree.
A system designer who understands knowledge graphs, ontologies, and interoperability at scale.
A community organizer who can rally early-adopter communities and build the culture of open curation.
An expert in data quality, provenance tracking, and confidence-scoring systems.
A developer focused on import/export pipelines, API design, and integration with existing tools.
A public voice who can articulate why knowledge commons matter and attract institutional partners.
A business strategist who designs the model that keeps the commons funded without compromising it.
Current status: The vision is clear, the need is real, but the movement has not yet formed. Like Wikipedia on the eve of its 2001 launch, we need the right combination of idealism, technical talent, and timing. The seed is planted; it waits for its six roots.
Global Wiki Tree does not exist in isolation. It draws from and feeds into several other holons within the EvoBioSys ecosystem, forming a knowledge layer that connects tools, communities, and data.
Interoperating knowledge silos — the local-first note-operating system that Global Wiki Tree can export to and import from.
The garden of side-projects where Global Wiki Tree was planted as a seed and nurtured into its current form.
Global Wiki Tree builds on semantic web principles: linked data, structured ontologies, machine-readable knowledge.
Global Wiki Tree stands on the shoulders of tools and communities that have been exploring networked knowledge for years. These are the roots we draw from.