This is my current mental picture of the architecture of the parts included in the RDF import/export functionality I'm implementing for Semantic MediaWiki as part of my Google Summer of Code project. I just got the ARC2-based store functional. The functionality still to be implemented is shown with dashed lines:
     - - - - - - - - - -
    | Export | | Import |
     - - - - - - - - - -
            ^ |
            | v
     - - - - - - - - - -     ---------------
    | Equiv URI handler |-> |  SMW Writer  |
     - - - - - - - - - -     ---------------
            ^ |                    |
            | v                    v
    ---------------------    ---------------
    | SPARQL+ Interface |    |     SMW     |
    ---------------------    ---------------
            ^  ______________/
            | |
            v v
     -----------------       ----------------
    |   ARC2 Store    |      | MediaWiki DB |
     -----------------       ----------------
Now I have a working RDF store connector for Semantic MediaWiki that uses ARC2's RDF store rather than SMW's built-in store. This will make it possible to take advantage of functionality in ARC2, such as the possibility to set up a SPARQL endpoint, etc.
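For reference, this is roughly what setting up an ARC2 store and SPARQL endpoint looks like. A minimal sketch only: the database credentials and store name below are placeholders, not the actual configuration used by the connector.

```php
<?php
// Minimal ARC2 store setup sketch. The credentials and the
// store name are placeholders, not the connector's real config.
include_once("arc2/ARC2.php");

$config = array(
  'db_host'    => 'localhost',
  'db_name'    => 'smw_db',
  'db_user'    => 'user',
  'db_pwd'     => 'password',
  'store_name' => 'smw_arc2_store',
);

// Create the triple store and, on first run, initialize its tables.
$store = ARC2::getStore($config);
if (!$store->isSetUp()) {
  $store->setUp();
}

// The same configuration can drive a SPARQL endpoint script:
$config['endpoint_features'] = array('select', 'construct', 'ask', 'describe');
$ep = ARC2::getStoreEndpoint($config);
if (!$ep->isSetUp()) {
  $ep->setUp();
}
$ep->go(); // handles the incoming HTTP request and outputs the SPARQL result
```

Dropping the endpoint script into the web root is then enough to query the store over HTTP with any SPARQL client.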
Thanks to Alfredas Chmieliauskas for the Joseki store connector in the SparqlExtension for SMW, which this connector is heavily based upon.
The ARC2 connector implements the same portion of the SMWStore API as the JosekiStore, but I'm not yet sure whether more needs to be implemented for the things we want to do (general RDF import/export). I'll have to figure that out.
The code is available in the Google Code repository trunk, and install instructions are on the Google Code wiki.
Feel free to try it out, but be warned that it has only been very briefly tested so far!
Back on track with GSoC. Follow my progress on my Twitter.
I presented my MSc thesis project, "SWI-Prolog as a Semantic Web tool for semantic querying in Bioclipse", today. (Report available for download here.)
Find the slides below. I expected a largely non-informatics audience (though some of my fellow Bioclipsers showed up =) ), had only 20 minutes, and lots of non-common-knowledge material to introduce, so the slides are mostly a bunch of pictures for talking through the basics of the semantic web, Prolog and Bioclipse.
Turning back to the GSoC project now!
I started the actual coding for GSoC this Wednesday (the start was delayed two weeks because of exams, which I'll catch up on). I'm still just getting up to speed, but am now looking into the PHP RDF framework ARC, whose RDF store will replace the currently used RAP store in Semantic MediaWiki. Using ARC itself looks very straightforward; I just have to figure out the SMW Store API. I'm looking at SMWRapStore2.php now to get an idea.
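As a note to self, the general shape of a store connector, as I understand it from reading SMWRapStore2.php, is a subclass of the default SQL store that overrides the update/delete methods and mirrors the changes into the external RDF store. A rough sketch under that assumption; the class name and the pushToArc2()/removeFromArc2() helpers are my own placeholders, not existing SMW or ARC2 functions:

```php
<?php
// Rough sketch of a store connector, based on my reading of
// SMWRapStore2.php. SMWARC2Store, pushToArc2() and removeFromArc2()
// are hypothetical names, not existing SMW or ARC2 code.
class SMWARC2Store extends SMWSQLStore2 {

  public function updateData(SMWSemanticData $data) {
    // Let the default store update the MediaWiki DB first ...
    parent::updateData($data);
    // ... then mirror the subject's triples into the ARC2 store.
    $this->pushToArc2($data);
  }

  public function deleteSubject(Title $subject) {
    parent::deleteSubject($subject);
    $this->removeFromArc2($subject);
  }

  private function pushToArc2(SMWSemanticData $data) {
    // Serialize $data to triples and add them to the ARC2 store.
  }

  private function removeFromArc2(Title $subject) {
    // Delete the subject's triples from the ARC2 store.
  }
}
```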
If you want to follow my progress in (approximate) real-time, then see my twitter.
My degree project, titled "SWI-Prolog as a Semantic Web tool for semantic querying in Bioclipse", is getting closer to finished. My report is now approved by the scientific reviewer (thanks, Prof. Mats Gustafsson), so I wanted to make it available here (Download PDF). Reports of typos are of course welcome! :)
Coding for my GSoC project will start for real around June 9th, but I just had a first look at the code, to start wrapping my head around the things involved. I installed the following on my local SMW:
I tested the RAP-based SPARQL endpoint, played a bit with SMWWriter, and tried to get a grip on how to best use existing functionality for implementing RDF import/export.
Some questions that arose (for Denny in the first place, I guess, but feel free to comment):
We are probably going to use SMWWriter for extending the RDF import/export functionality of Semantic MediaWiki, so I wanted to test it out a bit.
With some copying and pasting of code from this page, I quickly had a MediaWiki special page set up, where I could use SMWWriter's internal API to implement a crude form for adding or removing "triples" in my Semantic MediaWiki. See screenshot:
And the result, on the Methane page:
This looks promising. Connecting it with some ARC functionality for parsing SPARQL and RDF/XML should be a big step in the right direction.
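For instance, ARC2's bundled RDF/XML parser already produces triples in a simple array form that could be handed on to SMWWriter. A sketch; the RDF/XML snippet and the example.org property URI are made up for illustration:

```php
<?php
include_once("arc2/ARC2.php");

// Parse an RDF/XML fragment into ARC2's triple-array form.
// The snippet and the example.org URIs are made up for illustration.
$rdfxml = '<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ex="http://example.org/">
  <rdf:Description rdf:about="http://example.org/Methane">
    <ex:CAS_number>74-82-8</ex:CAS_number>
  </rdf:Description>
</rdf:RDF>';

$parser = ARC2::getRDFXMLParser();
$parser->parse('http://example.org/', $rdfxml);

foreach ($parser->getTriples() as $t) {
  // Each triple is an array with 's', 'p', 'o' (plus type) keys,
  // ready to be fed to SMWWriter or a SPARQL+ INSERT on the store.
  echo $t['s'] . ' ' . $t['p'] . ' ' . $t['o'] . "\n";
}
```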
(For my internal documentation, mostly)