My name is Alexandru Sabo, and I am a freelance digital illustrator and 2D/concept artist. Over the last 15 years I have worked both as an employee and as a freelancer for different companies and various clients at the same time. A few to mention: Highlander Studio, Crytek, Fantasy Flight Games, and recently I started working for Paizo Publishing.

Do you paint professionally, as a hobby artist, or both?
Professionally.

What genre(s) do you work in?
Mostly fantasy.

Whose work inspires you most — who are your role models as an artist?
When I started with fantasy, I learned a lot from Warhammer Army Books. That means Adrian Smith, and after that I was corrupted further by Jim Murray’s work, which was closer to me (more of a comics/conceptual style).

When did you try digital painting for the first time?
In 2001, with my first children’s book.

What makes you choose digital over traditional painting?
I don’t know if it’s a matter of choice. From 2000 on I worked in graphic design for ten years, and when I encountered my first Wacom tablet in 2001, it was a natural thing to use with Photoshop. But I still believe in traditional painting: my sketches and drawings are still made mostly with traditional techniques.

How did you find out about Krita?
In April 2014 I was thinking about starting a new small illustration and graphic art studio. I was looking for a way to run two to four computers on open source software (low cost). Gimp was not an option, and after searching further I found David Revoy’s article “My hardware and software for digital painting”. Then I realised that Krita was the first viable option for me. Another major point was that Krita could manage CMYK files, a big issue for an illustrator who works for print!

What was your first impression?
Awesome ;). It was very simple and friendly to use. Its GUI and workflow were very close to Photoshop, which was something important to me at that moment.

What do you love about Krita?
Hmm… funny question, because I do not love software. As a professional computer-graphics artist it’s more a matter of “can this software satisfy my needs or not?” To mention some very useful and maybe unique features: the easy and very fast way to make a mirror view, and the simple move tool as a brush (a kind of liquify tool/filter, as in Photoshop). But overall, it is so natural to start working in Krita.

What do you think needs improvement in Krita? Is there anything that really annoys you?
To be honest, stability (the most important thing for me) and managing very high resolution files. Also improvements in the layer/layer-group management system, but I understand that there are already plans to fix those soon.

What sets Krita apart from the other tools that you use?
Free! For freelancers, this can be a blessing, even if the best OS to run it on is Linux so far. And as I mentioned before, it’s natural to work with and very user friendly.

If you had to pick one favourite of all your work done in Krita so far, what would it be, and why?
I can’t tell you right now, because it’s still under NDA! But I like the illustration with Lini, an iconic character from the Pathfinder roleplaying world.

What techniques and brushes did you use in it?
My technique is very simple and common: basic brushes and a few extras, nothing that can’t be replicated in other software, plus some common layer filters. Actually nothing special.

Where can people see more of your work?
The best up-to-date place is my official site, which has a blog section as well, where I post my latest works: http://alexandrusabo.ro/

Anything else you’d like to share?
Maybe the new Patreon campaign I started a few weeks ago, where I will do all my personal projects and try to keep fans and people who like my work in the same place. Of course there are some ways there to support me in doing more…
I try to offer some nice rewards there as well, so everybody is welcome!
In my last blog entry, I mentioned that we have been working on a comprehensive data loss prevention (DLP) and audit trail system for use with Kolab, with the end goal being not only DLP but also a platform for business intelligence. In that entry I listed the three parts of the system, noting that I'd be writing about one at a time. I had hoped to jump on the first of those a day or two after writing the entry, but life and work intervened and then I was off on a short family vacation ... but now I'm back. So let's talk about the capture side of the system.
Kolab can be viewed as a set of cooperative microservices: SMTP, IMAP, LDAP, spam/virus protection, invitation auto-processing, web UI, and so on. There are a couple dozen of these, and until now they have all done the traditional, and correct, thing of logging events to a system log.
This has numerous drawbacks, however. First, on a distributed system where different services run on different hosts (physical or VMs), the result is data spread over many systems, which is not great for subsequent reporting. At the time of logging, the events are in a "raw" state: each service likely does not know about the rest of the Kolab services, and thus how its events relate to the whole system. Because logs go through the host systems, it is also difficult to ensure that they are not tampered with; this can be somewhat alleviated by setting up remote logging, but that only goes so far. Finally, logging tends to be a firehose of data, and for our specific interests here we want a very specific sub-stream of that total flow.
So we have written yet another service whose entire job is to collect events as they are generated. This service is itself distributed, allowing collection agents to be run across a cluster running a Kolab instance, and it stores its data in a dedicated key-value store which can be housed on an isolated (and specially secured, if desired) system. The program running this service is called Egara, which is Sumerian for "storehouse", and it is written in Erlang due to its robustness (this service must simply never go down), scalability and distributed communication features. The source repository can be found here. Egara itself is part of the overall DLP/auditing system we have named Bonnie.
The high-level purpose of Egara is to create a consistent and complete history of what happens to objects within the groupware system over time. An "object" might be an email, a user account, a calendar event, a tag, a note, a todo item, etc. An event (or "what happens") includes things such as new objects, deletions, setting of flags or tags, changing state (e.g. from unread to read), starting or tearing down an authenticated session, etc. In other words, its job is to create, in real time, a complete history of who did what when. As such I've come to view it as an automated historian for your world of groupware.
Egara itself is divided into three core parts:
- incoming handlers: these components implement a standard behavior and are responsible for collecting events from a specific service (e.g. cyrus-imap) and relaying them to the core application once received
- event normalizers: these workers process events from the new event queue and are tasked with normalizing and augmenting the data within them, creating complete point-in-time additions to the history. Many events come in with simple references to other objects, such as a mail folder; the event normalization workers need to turn those implicit bits of information into explicit links that can be reliably followed over time
- middleware: these are mainly the bits that provide process supervision, populate and manage the shared queues of events as information arrives from incoming handlers and is processed by normalizers.
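To picture what the normalization step does, here is a small illustrative sketch in Python (Egara itself is Erlang, so this is not its actual code; the event fields, the folder index and the IDs are all made up). The key idea is that an implicit reference, here a folder path, becomes an explicit, stable identifier that survives later renames:

```python
import json
from datetime import datetime, timezone

# Hypothetical index mapping mutable folder paths to stable internal IDs.
FOLDER_IDS = {"user/jane/INBOX": "f-1842", "user/jane/Archive": "f-1907"}

def normalize(raw_event):
    """Turn a raw service event into a self-contained history entry.

    The implicit folder path is resolved into an explicit, stable
    identifier that stays valid even if the folder is later renamed.
    """
    return json.dumps({
        "type": raw_event["event"],
        "actor": raw_event.get("user", "unknown"),
        "object": raw_event["uri"],
        "folder_id": FOLDER_IDS.get(raw_event["folder"]),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    })
```

The output of this step is a complete point-in-time addition to the history, ready for final storage.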
This all happens asynchronously and provides guarantees at each step of correct handling (inasmuch as each reporting service allows for that). This means that individual normalizers can fail in even spectacular fashion without disrupting the system, and that an admin can halt and restart the system at will without fear of losing events (save those generated during downtime, assuming a full Egara take-down), etc.
Final storage is done in a Riak database, with queues managed by the Mnesia database built into Erlang's OTP system itself. Mnesia can best be thought of as a built-in Redis: entirely in-memory (fast) with disk backing (robust), plus built-in clustering and a native, first-class API for storage and retrieval (e.g. we are able to use Erlang functions to perform updates and filtering over all or part of a queue's dataset). Data in Mnesia is stored as native Erlang records, while data in Riak is stored as JSON documents.
Incoming events may arrive in any format over any delivery mechanism. They can be parallelized, spread across a cluster of machines ... it doesn't matter. The incoming handler is tasked with translating the stream of events into an Erlang term that can be passed on to the normalizer for processing. This allows us to extend Egara very easily with new service-specific handlers for virtually any dataset we wish to keep track of within Kolab or its surroundings.
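The translation step can be sketched like this (again in Python rather than Erlang, and with an invented "key=value" notification format purely for illustration; real handlers are service-specific):

```python
def handle_notification(line):
    """Parse a hypothetical 'key=value;key=value' service notification
    into a structured term that the normalizers can process.

    This only illustrates the shape of the translation step: raw text in,
    a uniform structure out, with service-specific details kept in 'data'.
    """
    fields = dict(part.split("=", 1) for part in line.strip().split(";"))
    return {
        "service": fields.pop("svc"),
        "event": fields.pop("ev"),
        "data": fields,  # whatever extra fields the service supplied
    }
```

Whatever the wire format, the handler's only job is this translation; everything downstream sees one uniform event shape.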
Normalizers will eventually also join this level of abstraction, though right now the sole worker implementation is specific to groupware data objects. Future releases of Egara will add support for different workers for different classes of events, giving a nice symmetry with the incoming event handlers.
The middleware is designed to be used without modification as the system grows in capability while remaining scalable. Multiple instances can be run across different systems and the results should (eventually) be the same. I say "eventually" since in such a system one cannot guarantee the exact order of events, only the exact results after some period of time. Or, in more familiar terms, it is eventually consistent.
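A toy example makes the "eventually consistent" point concrete (illustrative only; the event shape and timestamps are invented): if state is rebuilt by ordering events on their origin timestamps at read time, two nodes that received the same events in different orders converge on the same result.

```python
def replay(events):
    """Rebuild an object's flag state from its event history.

    Events are sorted by their origin timestamp at read time, so the
    arrival order on any given node does not affect the final state.
    """
    state = {}
    for ev in sorted(events, key=lambda e: e["ts"]):
        state[ev["flag"]] = ev["value"]
    return state

# The same two events, delivered to two nodes in opposite orders.
node_a = [{"ts": 1, "flag": "seen", "value": False},
          {"ts": 2, "flag": "seen", "value": True}]
node_b = list(reversed(node_a))
```

Both `replay(node_a)` and `replay(node_b)` yield the same state, which is the guarantee the middleware aims for across a cluster.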
The whole system is quite flexible at runtime, as well. One can configure which kinds of events one cares to track; which data payloads (if any) to archive; which incoming handlers to run on a given node, etc. This will expand over time as well to allow normalizers and their helpers to be quarantined to specific systems within a cluster.
Egara works nicely with Kolab 3.4 and Kolab Enterprise 14, though Bonnie is not officially a part of either. I expect the entire system will be folded into a future Kolab release to ease usage. It will almost certainly remain an optional component, however: not everyone needs these features, and if you don't then there's no reason to pay the price of the runtime overhead and maintenance.
That's a "50,000 foot" view of the historian component of Bonnie. The next installments in this blog series will look a bit closer at the storage model, history querying and replayability and, finally, what this means for end-users and organizations running Kolab with the Bonnie suite.
Packages for the release of KDE's desktop suite Plasma 5.3 beta are available for Kubuntu Vivid. You can get it from the Kubuntu Beta PPA.
The first release of KDE's 15.04 series of Applications, along with Frameworks 5.9.0, is now available to all Chakra users. With this release kde-workspace has also been updated to version 4.11.18 and kdelibs to 4.14.7. Keep in mind that the applications that have been ported to Frameworks 5 will not be updated but remain at their previous versions, as they are being prepared for inclusion in the upcoming Plasma 5 switch.
According to the official announcement, starting with this release KDE Telepathy and kdenlive will be shipped together with the rest of KDE Applications.
In addition, the following notable updates are now available:
- linux 3.19.4
- nvidia 346.59
- git 2.3.5
- vlc 2.2.1
- wine 1.7.41
- ruby 2.2.1
- digikam 4.9.0
- apache 2.4.12
- subversion 1.8.13
- bomi (a Qt5 GUI player based on mpv) 0.9.7
- otf-source-han-sans (CJK fonts) 1.002
It should be safe to answer yes to any replacement question from Pacman. If in doubt, or if you face another issue, please ask or report it in the related forum section.
As always, make sure your mirror is fully synced (at least for the core, desktop and platform repositories) before performing this update by running the mirror-check application.
My dear readers, I have agreed with ArangoDB to help them spread the word about their – of course open source – multi-model NoSQL database, and I will be using this blog to do that. It’s not going to be boring marketing bla bla, but I’m planning to write about things like tutorials, interviews with ArangoDB users and contributors, as well as the adventures of a “kind-of-geeky psychologist” (i.e. me) taking a stab at developing a data-intensive web application using ArangoDB.
PlanetKDE is not subscribed to this whole blog, however, but only to a specific category. Therefore, I can control whether these posts show up on the Planet or not. Since many of my readers here are rather on the tech-savvy side of the spectrum, I suppose that these posts might be of interest to at least some, if not many, of you, but I don’t want to “spam” the Planet with these posts if the majority would not be interested in them.
Therefore, I want to give you, my readers, a choice. Please vote using the poll below on whether you’d like to see NoSQL database-related posts show up on the Planet or not. Of course I can revise the decision later if I get a lot of feedback to the contrary on my individual posts, but I’d like to get a quantitative picture of the general preference beforehand.

Take Our Poll
Paul and Lydia have blogged about how KDE should and could evolve. KDE as a whole is a big, diverse, sprawling thing. It's a house of many rooms, built on the idea that free software is important. By many, KDE is still seen as being in competition with Gnome, but Gnome still focuses on creating a desktop environment with supporting applications.
KDE has a desktop project, and has projects for supporting applications, but also projects for education, projects for providing useful libraries to other applications and projects to provide tools for creative professionals and much, much more. For over a decade, as we've tried to provide an alternative to proprietary systems and applications, KDE has grown and grown. I wouldn't be able, anymore, to characterize KDE in any sort of unified way. Well, maybe "like Apache, but for end-users, not developers."
So I can only really speak about my own project and how it has evolved. Krita, unlike a project like Blender, started out to provide a free software alternative to a proprietary solution that was integrated with the KDE desktop and meant to be used by people for whom having free software was the most important thing. Blender started out to become the tool of choice for professionals, no matter what, and was open sourced later on. It's an important distinction.
Krita's evolution has gone from being a weaker, but free-as-in-freedom alternative to a proprietary application to an application that aspires to be the tool of choice, even for people who don't give a fig about free software. Even for people who feel that free software must be inferior because it's free software. When one artist says to another at, for instance, Spectrum "What, you're not using Krita? You're crazy!", we'll have succeeded.
That is a much harder goal than we originally had, because our audience ceases to be in the same subculture that we are. They are no longer forgiving because they're free software enthusiasts and we're free software enthusiasts who try really hard; they're not even particularly forgiving because they get the tool gratis.
But when the question is: what should a KDE project evolve into, my answer would always be: stop being a free software alternative, start becoming a competitor, no matter what, no matter where. For the hard of reading: that doesn't mean that a KDE project should stop being free-as-in-freedom software, it means that we should aim really high. Users should select a KDE application over others because it gives a better experience, makes them more productive, makes them feel smart for having chosen the obviously superior solution.
And that's where the blog Paul linked to comes in. We will need a change in mentality if we want to become a provider of the software-of-choice in the categories where we compete.
It means getting rid of the "you got it for free, if you don't like it, fuck off or send a patch" mentality. We'd all love to believe that nobody thinks like that anymore in KDE, but that's not true.
I know, because that's something I experienced in the reactions to my previous blog. One of the reactions I got a couple of times was "if you've got so much trouble porting, why are you porting? If Qt4 and KDE 4 work for you, why don't you stay with it?" I was so naive, I took the question seriously.
Of course Krita needs to be ported to Qt5 and Kf5. That's what Qt5 and Kf5 are for. If those libraries are not suitable for an application like Krita, those libraries have failed in their purpose and have no reason for existence. Just like Krita has no reason for existence if people can't paint with it. And of course I wasn't claiming in my blog that Qt5 and Kf5 were not suitable: I was claiming that the process of porting was made unnecessarily difficult by bad documentation, by gratuitous API changes in some places and in other places by a disregard for the amount of work a notional library or build-system 'clean-up' causes for complex real-world projects.
It took me days to realize that asking me "why port at all" is in essence nothing but telling me "if you don't like it, fuck off or send a patch". I am pretty sure that some of the people who asked me that question didn't realize that either -- but that doesn't make it any better. It's, in a way, worse: we're sending fuck-off messages without realizing it!
Well, you can't write software that users love if you tell them to fuck off when they have a problem.
If KDE wants to evolve, wants to stay relevant, wants to compete, not just with other free software projects that provide equivalents to what KDE offers, that mentality needs to go. Either we're writing software for the fun of it, or we're writing software that we want people to choose to use (and I've got another post coming up elaborating on that distinction).
And if KDE wants to be relevant in five years, just writing software for the fun of it isn't going to cut it.
While we continue to work on bugs for the next release (2.9.3), we have also been planning and working on the next Kickstarter!
We have been gathering your feedback across the forum, social media, and our chat room (IRC). We want to make the next feature release (3.1, planned for end of this year) the best possible. We are planning on launching the next Kickstarter on May 4. Two weeks! We have two big projects in mind – as well as some exciting stretch goals. The first project is performance improvements. This includes speeding up the application and painting with seriously large brushes. Creating and working with large canvas sizes will be much more responsive.
The second big goal is adding an animation system. This will help artists create sprite sheets for their game jams, animatics for story boarding, and potentially even produce an entire animated film! While this new system won’t be as feature rich as dedicated animation software, it will be substantially more powerful than Photoshop’s animation tools. It will include things like onion skinning and tweenable properties. We will provide more details in the coming weeks.
Our target goal for this Kickstarter is going to be €20,000 (about $21,000). With everyone’s help, we think this is attainable. Like the last Kickstarter, the money will cover a developer’s salary. For every €1,500 (about $1,600) we go over the goal, we will add a stretch goal.
Some of the stretch goals will be further animation features; others will be workflow improvements, new features for the brushes and more. There are too many stretch goals to list here! Like the last Kickstarter, what gets included will be voted on by the Kickstarter backers.
We will let you know when the Kickstarter is launched so you can get all of the details. You can always sign up for the mailing list (at the bottom of this post) to stay up to date with all the news.

Layer Styles Update
Adding layer styles to Krita is a really BIG task. It turned out to be much more work than we planned for. This is why it hasn’t made it into the Krita 2.9 release yet. There is still work to be done, but we really want to get this into your hands for you to start playing around with. Starting with the next release (2.9.3), we will be including layer styles into Krita. While there are some features that we are still working on, we want you to play around with what we have. We will be continuing to work on it, so rest assured that the bugs and kinks will be ironed out in the future. Until then, check out this teaser video Wolthera made:
If you consider yourself a serious developer, you know writing good commit messages is important. You don't want to be that guy:
This applies to source comments as well: good comments save time, bad comments can be worse than no comments.
For a long time, I usually favored source comments over commit messages: whenever I was about to commit a change which needed some explanations, I would often start to write a long commit message, then pause, go back to the code, write my long explanation as a comment and then commit the changes with a short message. After all, we are told we should not repeat ourselves.
Recently I was listening to Thom Parkin talking about rebasing on Git Minutes #33 (Git Minutes is a great podcast BTW, highly recommended) and he said this: "Commits tell a story". That made me realize one thing: we developers read code a lot, but we also read a lot of commit histories, either when tracking a bug or when reviewing a patchset. Reading code and reading history can be perceived as two different views of a project, and we should strive to make sure both views are readable. Our readers (which often are our future selves...) will thank us. It may require duplicating information from time to time, but that is a reasonable trade-off in my opinion.
So, "Write extensive source comments or extensive commit messages?" I'd say: "Do both".
First of all, the Ardour people have written a building page and a list of dependencies. They do carry a set of patches for some of the packages. These seem to be more or less small fixes, apart from libsndfile, which has a bug fix for handling BWF files.
In addition to the patched libs, the requirements list a whole range of gtk and corresponding -mm packages, as well as boost and various codecs and such. I decided not to care too much about the versions of these packages. Instead, I just took whatever I could find in Debian. The packages installed are:
Then it is just a matter of configuring using waf.
./waf configure --with-backend=alsa --prefix=/wherever/you/want/it
My plan is to use ALSA (i.e. not JACK) and installing libjack-dev meant that Skype got kicked out, so the system needed some love to restore the order.
apt-get remove libjack-dev
apt-get remove libjack0
dpkg --install skype-debian_184.108.40.206-1_i386.deb
apt-get install -f
Despite this little hack, Ardour seems to work nicely and record and play back. I still need to test out some more features to see if everything is in place, but it looks hopeful.
The board of KDE eV has launched a new initiative to ensure that KDE remains awesome and relevant for the foreseeable future. Unlike previous approaches it is not a point-in-time solution, it is a continuous process of improvement. And it is a good thing. Previously, I have written/spoken a lot about the role of Brooks’ Law in the context of…
So I finally have something blog-worthy after a long time; at the least, writing this post will be a decent way to make it through the last five hours of the journey back from Kerala. I feel a bit of context is required here: a while back I had submitted a talk proposal for conf.kde.in 2015, which, to my delight, was accepted. This year's conference was held at Amritapuri and, thanks to the insane amount of effort put in by the FOSS club at Amrita Vishwa Vidyapeetham University, it went smoothly. A special shout-out to R. Harish Navnit. This was my first KDE conference and I must say it was amazing. I met awesome KDE India people, learnt a lot from the other talks and, most importantly, made new friends who love KDE.
Enough small talk; let's talk a bit about the conference now. The conference opened with our keynote speaker Noufal Ibrahim, founder of PyCon India, giving a demo of combining command line utilities to create a summary of Moby Dick from the book. Straight to demonstrations, no boring stuff; that's how we roll.
Noufal, even though not a KDE user, did an amazing job of showing how powerful, small & reusable utilities can be, when combined creatively.
Then Pradeepto and Shantanu took the stage to tell the students about what KDE is. Having been involved with KDE for ages, these two are obviously the best people for the job. They demonstrated lots of KDE software, told the students about the KDE community and motivated them to contribute. This was followed by Somsubhra's talk on Krita, in which he demonstrated its power with the help of tons of videos.
Then, after a short break for lunch, came the moment I'd been simultaneously dreading and looking forward to: my talk. As I already stated, this was my first conf.kde.in, and it was also my first talk at such a huge event. I was pretty anxious about it, but encouragement from the other speakers, especially Devaja Shah, helped me calm my nerves, and I went on to give a decent talk which I hope motivated the students there to contribute to KDE. In my talk I shared my experience with the KDE community and told the students about my SoK project with Baloo and about getting started with code contributions, complete with demos of IRC, fetching, building, changing code, and generating and submitting patches. One major aspect of my talk was to get the students to start using KDE and improving what they feel needs improving: scratching their own itch.
My talk was followed by a hands-on QML session by Shantanu. It was amazing to see students reading through the documentation and experimenting with things they wanted to do, not just sticking to what they were being taught. He started with the basics and ended with animations. I have to say Shantanu is a good teacher; even I learnt a couple of things from the session. The first day concluded with the students interacting with the speakers and asking their questions.
Sadly, I missed the pre-lunch talks on the second day by Sanjiban, Sinny, Jigar and Rishab, as I was helping Shantanu with hands-on QML sessions for students who'd missed the opportunity on the first day. But judging by the students' enthusiasm, I'm sure they did an amazing job. After lunch, Devaja Shah took the students on a journey through the KDE galaxy, giving a tour of a lot of planets (read: KDE projects). This was followed by her presenting other ways of contributing to KDE apart from coding, including participating in the Promo team, writing dot stories and helping with localization of KDE software.
The last two talks were by Ashish Madeti, who gave impressive demos of his GSoC project with PMC by playing some awesome music in Plasma Media Center using MPRIS, and Karan Luthra, who gave an amazing presentation on Trojitá, the IMAP e-mail client. His talk helped the students understand the concepts of IMAP, what Trojitá exactly is and how they can contribute to it. All in all, it was an amazing experience with enthusiastic students eager to learn new things. I hope we get tons of new contributors. I finally met Pradeepto Bhattacharya, the founder of KDE India, and talked a lot about KDE and random stuff. It was pretty amazing meeting other KDE lovers. This was indeed my initiation into the KDE India community, and I hope to be at the future events we hold.
A few photographs I took:
Now to the fun part: exploring Amritapuri. After the conference we went on to watch the sunset from the 18th floor of a building along the coast, an amazing thing to witness. Amritapuri is a beautiful place situated along the western coastline of India. I finally saw the renowned Kerala backwaters from a bird's eye view.
One of the things I really did not like about our todo.kde.org service is that in order to use the API to get your tasks and projects, you need to have administrator privileges. Therefore, no sane tool actually supports it.
Phabricator is much nicer in that regard. It has a nice API, and it even has a Python module which can be used to retrieve anything that you’d want it to.
The first thing I wanted was to pull issues and review requests from Phabricator into TaskWarrior (if you still do not know what TW is, you ought to investigate it right this instant).
There is an external tool called BugWarrior that is able to get tasks from a range of different services like GitHub, Jira, Bugzilla (it does not work with bugs.kde.org) and others. Fortunately, it also supports Phabricator.

Loading Phabricator tasks into TaskWarrior
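To give an idea of what BugWarrior does under the hood, here is a minimal sketch of mapping a Phabricator Maniphest task onto TaskWarrior's import format. The field names on the Phabricator side (`title`, `isClosed`) are assumed from the Conduit API's task output, and the mapping itself is my own simplification, not BugWarrior's actual code:

```python
def to_taskwarrior(phab_task):
    """Map a Phabricator Maniphest task (field names assumed from the
    Conduit API) onto a TaskWarrior import entry.

    BugWarrior performs a richer version of this mapping for you; this
    sketch only shows the shape of the data involved.
    """
    return {
        "description": phab_task["title"],
        "project": "phabricator",
        "status": "completed" if phab_task["isClosed"] else "pending",
        "tags": ["phab"],
    }
```

The resulting entries, serialized as JSON, are the kind of thing you can feed to `task import`.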
When I became president of KDE e.V. in September last year I made a list of important things we need to do over the coming year. The biggest item on this list was helping KDE and KDE e.V. get a better understanding of where we are, where we want to go and how we want to get there. Today, with the help of many others, I want to start this process and I want you to be a part of it.
KDE began its life as a desktop project and Qt showcase back in 1996. Since then KDE has evolved to become something more significant; the modern KDE is a global community of technologists, designers, writers and advocates producing some of the world’s finest user-centric Free Software. As we have evolved, so too has the world around us. The user’s experience is no longer restricted to the desktop. It has expanded to the user’s hands, wrists, glasses and more and will continue to evolve into areas we have yet to imagine.
In KDE we want to be in charge of our future. We want to have a clear and honest approach for reacting to and influencing our shifting environment, to continuously and consciously improve. We want to do what is necessary to be the thriving community for creating technology that will satisfy the needs of the next 20 years' users.
In order to shape our evolution it is crucial that the wider KDE community understands its current position and where it aims to be in the future. As the primary support structure within the KDE community, KDE e.V. is instrumental in guiding that journey. Through regular honest assessment and reaction to our environment the KDE community continues to remain effective and relevant and ensures that KDE’s users will continue to Experience Freedom.
In order to provide the KDE community with the means to assess its current position and find future direction we have devised this yearly iterative process:
- First, we will gather extensive input from the wider community: everyone from core contributors to casual contributors to users. This will be done via various means, the main one being a survey, but also including forums, mailing lists, IRC office hours and in-person meetings; for example at Akademy.
- This input is then consolidated into a report. It is going to be published before Akademy for public consumption. This report will summarize community conclusions and potential areas of focus and improvement.
- During KDE e.V.’s annual general assembly, the report is discussed and some of the recommended focus areas are agreed on as goals.
- At a strategy sprint, core community members come up with measurable suggestions to achieve those goals.
- Finally, there will be a wrap up session that will evaluate how much progress we have made towards the goals we set ourselves. The evaluation will be presented at the next general assembly meeting.
KDE e.V. will support this process and its outcome. The intended outcome of this process is happy contributors and happy users.
I’d love for you to be a part of this process. As a first step please help us by taking the time to fill out the survey. Further information will be published on evolve.kde.org. If you have questions please ask them on the KDE community mailing list.
… a KDE PIM sprint happened in Toulouse! And what happened during that sprint? Well, read this wholly incomplete report!
Let’s start with the most important part: we decided what to do next! At the last PIM sprint in Munich in November, when Christian and Aaron introduced their new concept for the next version of Akonadi, we decided to refocus all our efforts on it. That meant switching KDE PIM into maintenance mode for a very long time and then coming back with a big boom. In Toulouse we discussed this plan again and decided that it would be much better for the project, and for the users as well, if we continued active development of KDE PIM instead of focusing exclusively on the “next big thing”, taking a one-step-at-a-time approach. So what does that mean?
We aim to release KF5-based KDE PIM in August as part of KDE Applications 15.08. After that we will keep fixing bugs, improving the current code and adding new features as usual, while at the same time preparing the code base for migration to Akonadi 2 (currently we call it Akonadi Next, but I think eventually it will become “2”). I will probably write a separate technical blog post on what those “preparations” mean. In the meantime Christian will be working from the other side on Akonadi 2, and eventually both projects should meet “in the middle”, where we simply swap the Akonadi 1 backend for the Akonadi 2 backend and ship the next version. So instead of one “big boom” release where we would switch to Qt 5 and Akonadi 2 at the same time, we do it step by step, causing as little disruption to the user experience as possible while allowing active development of the project to continue. In other words, a WIN-WIN-WIN situation for users, devs and the KDE PIM project.
I’m currently running the entire KDE PIM suite from git master (so KF5-based) and I must say that everything works very well so far. There are some regressions compared to the KDE 4 version, but nothing we couldn’t handle. If you like to use bleeding-edge versions of PIM, feel free to update and help us find (and fix) regressions (just be careful not to bleed to death ;-)).
Another discussion we had is closely related to the 15.08 release. KDE PIM is a huge code base, but the active development team is very small. Even with the incredible Laurent Montel on our side, it’s still not enough to actively maintain all of KDE PIM (yes, it’s THAT huge ;-)). So we had to make a tough decision: some parts of KDE PIM have to die, at least until a new maintainer steps up, and some will move to extragear and live their own lives there. What we release as part of KDE Applications 15.08 I call KDE PIM Core, and it consists of the core PIM applications: KMail, KOrganizer, KAddressbook, Kleopatra, KNotes and Kontact. If your favorite PIM app is not in the list, you can volunteer as a maintainer and help us make it part of the core again. We believe that in this case quality is more important than quantity, and this is the trade-off that will allow us to make the next release of PIM the best one to date ;-).
Also related to the release is a reorganization of our repos: we have some more splitting, and indeed some merging, ahead of us, but we’ll post an announcement once everything is discussed and agreed upon.
Thanks to Christian’s hard work, most of the changes that Kolab made in their fork of KDE PIM have been upstreamed during the sprint. Among other things, this includes some very nice optimizations and performance improvements for Akonadi, so the next release really will be a shiny one and there’s a lot to look forward to.
Vishesh brought up the topic of our bug count. We all realize the sad state of our PIM bugs, and we talked a bit about re-organizing and cleaning up our bug tracker. The clean-up has already begun: Laurent and Vishesh mass-closed over 850 old KMail 1 bugs during the sprint to make it at least a little easier to get through the rest. Regarding the re-organization, I still have to send a mail about it, but the short summary is that we want to remove the Bugzilla components and close the bugs for the apps we decided to discontinue, and maybe do a few more clean-up rounds for the remaining bugs.
I’m sure I’ve forgotten something because much more happened during the sprint but let’s just say I’m leaving some topics for others to blog about ;-).
Huge thank you to Franck Arrecot and Kevin Ottens for taking care of us and securing the venue for the sprint! All in all it was a great sprint and I’m happy to say that we are back on track to dominate the world of PIM.
The only disappointment of the entire sprint was my failure to acquire a French beer. I managed to try Belgian, Spanish, Mexican and Argentinian beer but they did not serve any French beer anywhere. Either there’s no such thing or it must be really bad…:-)
If you haven’t heard, there are a few KDE projects testing Phabricator for patch review and project management.
KActivities are amongst those.
So, from now on, if you want to provide patches, you are advised to do so through http://phabricator.kde.org/ and the Arcanist tool instead of ReviewBoard.
We are also going to organize our work tasks on Phabricator instead of todo.kde.org.
The ultimate goal is to test whether Phabricator is a viable alternative to a few of our services.
Let’s review what I’ve done to KDevelop’s kdev-cppcheck and kdev-valgrind plugins lately.
This is fairly straightforward: these plugins were still using the old .desktop plugin manifest files; now they use embedded JSON manifests. This isn’t something user-visible, but it’s needed, as the old .desktop method is now deprecated.
Added the number of calls to the callgrind output of kdev-valgrind
Until now the callgrind output only showed the IR and Inclusive IR fields. Now it shows the number of calls as well. Take a look at the pictures!
Until now, kdev-valgrind’s memcheck output unfortunately didn’t show enough of the call stacks to be really useful: you couldn’t see where exactly a problem occurred, or where it stemmed from! Now it shows the full backtrace plus the auxiliary trace as well, so you can see what actually causes the problems. See the pictures!
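As an aside, the .desktop-to-JSON manifest migration mentioned above boils down to shipping a small JSON file that gets compiled into the plugin binary (in KDE Frameworks this is typically done via the K_PLUGIN_FACTORY_WITH_JSON macro). A minimal sketch might look like this; the field values are illustrative, not copied from the actual plugins:

```json
{
    "KPlugin": {
        "Id": "kdevcppcheck",
        "Name": "Cppcheck Support",
        "Description": "Runs the cppcheck static analysis tool on the current file"
    },
    "X-KDevelop-Category": "Global"
}
```

The "KPlugin" block replaces the generic keys of the old .desktop file, while tool-specific keys (like the hypothetical "X-KDevelop-Category" above) move alongside it in the same file.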
As you might know, Calligra has now also started porting to Qt5/KF5. We are currently reaching the end of stage 1, where everything is re-added to the build (“links and installs? done!”), with the next stage being to fix anything that broke or regressed (see screenshots!1!).
But we also now see that no one in the current set of active Calligra developers has enough love left for Braindump, the note-taking and mind-mapping application from the creativity and productivity suite Calligra.
So as it stands Braindump would be left behind during the porting phase and be discontinued, for now at least :(
Braindump is a nice example of the flexibility of the Calligra architecture, where objects are implemented by so-called “Shape” plugins, which are then available to any application supporting “Shape”s in general. The actual code of Braindump itself is centered around the concept of whiteboards with an unlimited canvas, onto which one can drop all possible kinds of objects (the “shapes”) and then note their relations. With automated saving in the background, there is no need for any “Save” button.
Cyrille, who has developed Braindump, says:
“I am still interested in the application itself, but what it really needs is a better user interaction, and flake [name of the Shape system, ed.] is not flexible enough to provide it, and I don’t have the energy to make it flexible enough”.
He and the rest of the Calligra team will happily assist someone, ideally someone who already uses Braindump, who would like to take over future development for the Qt5/KF5-based version and enhance their workhorse. The porting phase is a good time to get to know the current system: for the first Qt5/KF5-based Calligra release, 3.0, we are concentrating on a pure port, so no new features or refactoring (ignore the exceptions ;) ), only minimal changes. Then envision the options after the port/3.0: e.g. get Braindump to run on your Android or Sailfish OS tablet! Connect it to syncing servers like ownCloud! Or whatever would enhance your Braindump usage.
And all done while enjoying the synergy effects from the shared libs and plugins of the Calligra suite.
Now is your chance :) Don’t hesitate too long, as Braindump will bitrot more and more once the 3.0 release is done and the Calligra libs see more refactoring.
Find us in the channel #calligra on irc.freenode.net, or join the mailing-list firstname.lastname@example.org.
In the beginning, I used Blogger. But it was limiting, and had its share of problems.
After a while, I decided to use my own host (ivan.fomentgroup.org) and switch to a mixture of Folite (a small CMS I wrote for my other sites) and WordPress (for the blog section).
This combination has provided my online presence (to use the marketing-speak :) ) since 2009. During that time, WordPress became a huge beast that I had to reinstall quite a few times, each time triaging which plugins broke it. WP has streamlined its interface, the admin section and everything, but it is quite difficult to manage if you need to dive into its code base.
So I decided to make a change. I decided to switch to a new domain (cukic.co) and to new blogging software.
I looked into Ghost (due to Aaron’s recent switch from Blogger to Ghost), Anchor, Wardrobe and a few other simpler blogging solutions. They seemed nice, but they either needed Node.js or something similar that is not supported by my hosting provider, or they just did not work for some reason.
Then I found Jekyll.
(Totally unrelated: BBC’s Jekyll, awesome TV show.)
Jekyll is a simple website generator: it lets you create templates and different page layouts with all the power of Ruby and Liquid, and write content in Markdown (among other formats), and it generates static HTML files from those, which you then just need to upload to the server.
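As an illustration (the title and front-matter values here are invented, not from this post), a Jekyll post is just a Markdown file with a YAML front-matter header telling Jekyll which layout to render it with:

```markdown
---
layout: post
title: "Hello, Jekyll"
date: 2015-04-15
---

Regular **Markdown** content goes here; the `layout: post`
line above picks the template the page is rendered into.
```

Running `jekyll build` then renders the whole source tree into a `_site/` directory of static HTML, ready to be uploaded to any web host.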
The Kdenlive team is happy to announce the release of Kdenlive 15.04.0. While there are very few new features in this release, it is a huge step towards a bright future!
Kdenlive is now released as part of KDE Applications, which has several implications:
- We fully benefit from KDE's infrastructure, which means fewer worries for the developers.
- We stick to the KDE Applications release schedule, which means one bugfix release every month and one feature release every 4 months. That's a big change from the previous random release every year or so. This is possible because the KDE team takes care of the releases, not our small dev team, so a big thank you to KDE.
- We now use KDE's bug tracker at https://bugs.kde.org.
- We benefit from KDE's build servers and team, which means that we might in the future have Mac OS and Windows versions without too much effort from the dev team.
- We can now be part of the Google Summer Of Code.
- We have adopted the KDE Applications numbering scheme. From now on, Kdenlive versions will be numbered with Year.Month.Bugfix. That explains the 15.04.0 version.
- Every KDE contributor can help us improve Kdenlive.
Most of the work for this release went into porting the code to Qt5 and KDE Frameworks 5 (KF5). While users will not see a direct benefit, this makes us ready for the next big steps. Changes in this version include:
- Since we are now based on Qt5/KF5, you NEED KDE Frameworks 5 to run Kdenlive.
- Fixed video stabilization
- Auto save new projects
- Download new render profile feature fixed
You can download the source code; binary packages for your distro should hopefully be prepared by distribution packagers.
What will change in the near future
While Kdenlive 15.04.0 is mostly a Qt5/KF5 port, we have many new features/improvements in preparation for the 15.08.0 release. Here are some of the features that we are currently working on:
- Finally integrate some of Till Theato's work resulting from our Indiegogo campaign. It took us 2.5 years, but we are finally merging parts of the refactoring effort.
- Use OpenGL for the video display, bringing back experimental GPU display and effects
- Add effects to project clips: for example, add a color correction to a clip in the Project Bin. Every instance of the clip that is then used in the timeline will have that color correction.
- Clean up the code to make it easier to understand.
That's it for today, I probably forgot many things but that might be an excuse to blog more often :).