Introduction: what are you talking about?
Ok, so let me explain this long title a little bit more. We have various Drupal sites: let’s call one the hub and the others the clients. The requirements are the following:
- Users shall be able to log in using the same username and password on all sites (hub and clients)
- When a user updates their profile on a client, the changes shall be reflected on the hub and on the other clients
- When a user updates their profile on the hub, the changes shall be reflected on the clients
These three requirements, even though they are quite simple to express, are really hard to realize, at least if you want to use industry standards and keep the setup easily extensible to new clients.
Before we go on, please note that the solution presented in this article is still experimental!
I found some similar use cases on the Internet, two of which were really interesting: Solving problems through collaboration and Simple Sign-On with OpenID. Thanks to Palantir.net and Developmentseed for sharing these use cases.
Just like the guys at Palantir.net, we looked at Deploy and found out, unfortunately after some development had already happened, that it wasn’t a good solution for our use case. We therefore drifted towards a solution based on OpenID for user login and PubSubHubBub for content synchronization. However, unlike the guys at developmentseed, we needed to synchronize not only user accounts but also their profiles, and we needed bi-directional synchronization.
OpenID Simple Sign On
We implemented the OpenID simple sign on using the process described in the article Simple sign on with OpenID.
- On the hub, download and install the openid_provider_sso module. Note that the module you will find on my github repository is the same as the one you will find in DevelopmentSeed’s article, the only difference being a small message explaining the login process to the user.
- On the client, download and install the OpenID SSO module. Again, this module is the same as the one you will find on DevelopmentSeed’s article.
- Following the screencast in DevelopmentSeed’s article, add a relying party on the hub and add the hub’s address on the client
- Depending on how your Drupal installation is set up, you might need to apply the patch from this issue to your OpenID module
You should now have a working installation for a simple sign on with OpenID.
User account synchronization
This solution assumes you are using Content Profile, and therefore that a user account is basically a username, an email address and an OpenID. Also, please note that one limitation of this solution is that it allows only one OpenID per user.
- Download and install the following modules:
- On both the hub and the client: Keyauth. Again, this module comes from DevelopmentSeed’s article; however, for now, URL key authentication is not used in this solution, and re-enabling it is one of the improvements this solution still needs.
- On the hub: PuSH user. Again, this module comes from DevelopmentSeed, with a few small modifications.
- On the client: Sync User. This module comes from DevelopmentSeed as well, but this time, I made quite a lot of modifications to it to allow for user profile synchronization.
- Once these modules are installed, you should no longer be able to change your email and/or username on the client, and a message should appear saying that you need to change these on the hub. Synchronization of user accounts is only one-way (from the hub to the client); don’t worry though, user profile synchronization is two-way.
- Try changing your email address on the hub: the client should be notified and the email address should change on the client as well. Make sure this works fine before going to the next step.
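The notification in the step above rides on the PubSubHubbub protocol: the client registers a callback URL with the hub for a given topic feed, and the hub then POSTs updated entries to that callback. As a rough illustration of the subscription handshake (which the Drupal modules normally handle for you), here is a minimal Python sketch that builds the subscription request body; the URLs are placeholders, not the modules’ actual endpoints:

```python
from urllib.parse import urlencode

def build_push_subscription(callback, topic, mode="subscribe"):
    """Build the form-encoded body of a PubSubHubbub subscription
    request, to be POSTed to the hub's endpoint."""
    return urlencode({
        "hub.mode": mode,          # "subscribe" or "unsubscribe"
        "hub.callback": callback,  # URL the hub will push new entries to
        "hub.topic": topic,        # the feed being subscribed to
        "hub.verify": "sync",      # ask for synchronous verification
    })

# A client subscribing to the hub's profile feed (placeholder URLs):
body = build_push_subscription(
    "http://client.example.com/profile-feed-callback",
    "http://hub.example.com/profiles/atom",
)
```

When the hub receives this request, it verifies the callback and then starts pushing new Atom entries to it as content changes.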
User profile synchronization
Now this is where most of the added value of this solution comes in. Up until now, it was basically a repeat of DevelopmentSeed’s article.
- Download and install the following modules on both the hub and the client:
- Views Atom: this module allows you to display Drupal nodes in Atom format using Views. User profiles will be pushed to the hub and clients in Atom feeds.
- Feeds Atom: this module parses Atom feeds for insertion using Feeds. Note that both of these modules (Views Atom and Feeds Atom) were provided by Palantir.net for this use case I talked about at the beginning of this article.
- PubSubHubbub pusher: this module allows you to configure which content types you want to push, when and how.
- Sync Nodes: this module adds a Processor to Feeds which allows Node synchronization.
- Apply some patches to the Views Atom and Feeds Atom modules. The patch for Views Atom can be found here: it changes the way the guid is generated (in order to keep the same guid on the hub and the clients, which is what makes synchronization possible) and adds a new view style called “RDF Nodes (Custom)” which allows you to define your own mapping of RDF properties. The patch for Feeds Atom can be found here: it adds a parser to Feeds which allows you to define your own mapping sources using a text field.
- I’ll refer you to the screencast you will find at the bottom of this article for details on how you should configure your views and feed importers. The following is just an outline:
- Configure the view of your user profiles on the client and the hub. On the hub, you can choose “RDF (Nodes)” as the row style, but on the client, you need to choose “RDF (Nodes) Custom” and give each field you want to show the same label as on the hub.
- Configure the feed importers on both the client and the hub. Look at the screencast really carefully here because there are some small tricks that will need to be fixed in the future.
- Subscribe the profile feed importer of the client to the hub by doing an import: again, this will need to be fixed in the future…
- That should be it. If you followed the screencast really carefully, you should have a working bi-directional user synchronization.
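The guid patch mentioned above matters because Feeds can only match an incoming entry to an existing node if the hub and every client compute the same guid for the same profile. One way to achieve that is to derive the guid deterministically from a canonical, site-independent identifier. The following Python sketch is illustrative only (the actual patch is PHP and may derive the guid differently), and the identifier URL is a placeholder:

```python
import uuid

def profile_guid(canonical_id):
    """Derive a stable, site-independent guid from a canonical
    identifier shared by all sites (e.g. the user's OpenID URL),
    so the hub and every client agree on the same guid."""
    return "urn:uuid:%s" % uuid.uuid5(uuid.NAMESPACE_URL, canonical_id)

# The same identity yields the same guid on any site that computes it:
hub_guid = profile_guid("http://hub.example.com/user/42/identity")
client_guid = profile_guid("http://hub.example.com/user/42/identity")
```

The point of the design is simply that the guid must be a pure function of the identity, not of the local node ID, which differs from site to site.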
Part 1 – Setting up OpenID
Part 2 – Setting up user account synchronization
Part 3 – Setting up user profile synchronization on the hub
Part 4 – Setting up user profile synchronization on the client
Part 5 – It works!
Limitations and improvements
This solution is still experimental, obviously, and has several limitations and needed improvements:
- User deletion: this solution only handles user inserts and updates. It should also handle user deletions.
- Signed URLs: even though DevelopmentSeed’s article uses Keyauth in order to sign each URL, I’m deactivating it here because, for some reason, it didn’t work when importing user profiles… It will need to be reactivated in the future.
- As Drupal is moving towards RDF, it would probably be better to use RDF instead of Atom feeds, and therefore use the RDF module to generate the views instead of Views Atom and Feeds Atom…
- Support for multiple OpenIDs: for now, this solution assumes that users only log in through the hub’s OpenID provider. They cannot connect through any other OpenID provider…
Conclusion: lots of advantages…
I believe using industry standards such as OpenID, PubSubHubbub and Atom is a great way to synchronize Drupal sites, and this can obviously be applied to any type of node. There are probably other ways to do this, but I believe this one is the closest to Drupal’s philosophy of adopting more and more RDF…
The second part of this implementation can be seen on Guillaume’s site: http://www.viguierjust.com/en/2011/02/06/drupal-6-complex-node-synchronization-with-pubsubhubbub/
The slides from the talk on the Zend certification can be downloaded here. Thanks to everyone for taking part.
Introduction: the requirements
A customer of ours wanted a website which could manage the integration of just about any external content (meaning video, audio and images, but also Google Docs, slides from SlideShare and books from Scribd) in a ‘Resource’ content type. There are multiple ways to do this in Drupal, however there are lots of possible pitfalls.
The possible solutions
I investigated and looked at the following solutions:
- The Embedded Media Field module: the module creates CCK field types that allow you to add EITHER video, audio or image from different content providers. The module has the following problems however:
- It creates 3 CCK field types: Embedded Audio Field, Embedded Video Field and Embedded Image Field. There is a patch here that creates a generic embedded field supporting all media types; however, the patch didn’t work in my case.
- You can “only” embed audio, video and images, so if you want to embed more (like Slideshare or Google Docs content), you will have to find another solution.
- Each time you want to add a new provider, if it’s not already provided as a Drupal module, you have to write a small module.
- The Media module will be THE media embedding solution in the future, however, at the time of writing this article, it is still unstable and under heavy development.
- Finally, the last solution that came to my mind was to allow users to simply embed some HTML code within a Resource. They would simply go to their YouTube video or SlideShare slides or whatever, copy and paste the embed code provided by the website, and be done.
Security pitfall: what NOT to do
It would be very tempting for someone who doesn’t know about cross-site scripting to simply allow users to use “Full HTML” in their posts. That would let your users embed just about anything, however they want, without any further configuration. However, for obvious reasons, you do not want to allow that: you want a more secure way of doing it.
In order to do it securely, you will basically need one single module: Embed filter. This module filters various embed tags (embed, object and script) based on a hostname: for example, you can say that you allow your users to embed anything that comes from youtube.com, but you won’t allow them to embed from any other website. This solves the cross-site scripting issue while still allowing your users to embed just about anything.
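To make the idea concrete, the core of such a filter is a hostname check on the source URL of each embed, object, or script tag. The following is only an illustrative Python sketch of that check (the Embed filter module itself is written in PHP, and its exact matching rules may differ):

```python
from urllib.parse import urlparse

def is_allowed_embed(src, allowed_hosts):
    """Return True if the embed source URL points at an allowed host.
    Subdomains of an allowed host (e.g. www.youtube.com) are accepted."""
    host = urlparse(src).hostname or ""
    return any(host == allowed or host.endswith("." + allowed)
               for allowed in allowed_hosts)

allowed = {"youtube.com", "slidesharecdn.com"}
is_allowed_embed("http://www.youtube.com/v/abc123", allowed)    # True
is_allowed_embed("http://evil.example.com/payload.js", allowed) # False
```

Note that the check must match on the full hostname or a dot-separated subdomain, never with a plain substring test, otherwise a host like youtube.com.evil.com would slip through.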
If you want to configure it in the same way that I’m going to show you in the following lines, you will also need the Better Formats module.
How to configure it?
First, you obviously need to create a ‘Resource’ content type. Allow your resource to have a Title and a Description.
Then, in your resource content type, create a text field labeled Embed HTML for example.
Then, you will need to create a new Input format. Site configuration > Input formats > Add input format. I called mine Filtered HTML with embed. When you configure it, make sure you activate the Object and embed tag filter which should be present if you have the Embed filter module activated. Also, when you configure it, make sure you allow the <object>, <script>, <embed> and <param> tags in the configuration of the HTML filter.
Then, you will need to allow your users to use this new format. Once you have done this, you will see that your users will be prompted to choose between the Filtered HTML format or the Filtered HTML with embed format. What you probably want is to remove this prompt, and allow the Filtered HTML with embed for the Embed HTML CCK field you created, but only allow the Filtered HTML format for the other fields (like the Description of the resource for example). This is where the Better Formats module kicks in.
Go to Input Formats and in Settings, check “Control formats per node type”. Then, you will need to apply this patch to the Better Formats module. Then go to your Modules list and activate the “Better Formats additional CCK text widgets” module. Go back to your resource content type and edit the ‘Embed HTML’ field you created. Click on ‘Change basic information’ and choose ‘Text field using Better Formats module’ as the Widget type. In the configuration of the field, you should now see an ‘Input format settings’ collapsible box. Only allow the ‘Filtered HTML with embed’ in the list of allowed formats.
Finally, in order to remove those collapsible boxes that allow users to choose their input format, go to the User permissions list and, under the ‘better_formats module’ section, uncheck the ‘collapsible format selection’ permission.
Configuring the embed filter module
Finally, you will need to configure the Embed filter module. Go to Site configuration > Embed filter and add whatever hosts you want to allow to the list. Be careful: some providers do not host their content at the same address as their site. For example, if you want to allow SlideShare, you need to add slidesharecdn.com to your allowed hosts instead of slideshare.com.
Also note that if you want to allow img or iframe tags (Google uses iframes to embed presentations and docs), you will need to apply this patch to the Embed filter module.
Conclusion: or help Media…
You can either do what’s explained in this article, or help the development of the Media module, because everything I wrote in this article will become obsolete once the Media module becomes stable. So waiting for and helping the development of the Media module can also be a good option…
A customer wanted a website with some event management that would include:
- A public, general calendar with all of the events
- A per-group calendar for each group of users that would display the group events
- The possibility for group members to create private events, accessible only to the other group members
- A per-user calendar that would display the events created by a specific user
- The possibility for users to create private events, that would be displayed only in their calendar and not viewable by any other user
I believe this set-up can be quite common for any relatively large social website, but I didn’t find any extensive documentation for it.
What NOT to do: the event and OG Calendar module
At first, I thought about installing the Event module, along with the OG Calendar module for the per-group calendars and the Calendar module for the per-user and public calendars. Turns out this is a bad idea, and there are many reasons why:
- The Calendar and Event modules don’t go together: Calendar is based on Views, while Event isn’t. Even when using Event Views, the Calendar module cannot display events coming from the Event module (or I missed something, which is possible).
- The Calendar and Event modules have two different ways of displaying their calendar, which would be double work for the designer if you were to use both modules
- The OG Calendar module is no longer being maintained
Here is the list of modules I needed to realize the setup:
- CCK: obviously needed to create an Event node type; you should already have it installed
- Views: required by Calendar, you should already have it installed
- Organic groups: this should be part of your installation as well, if you need a per-group calendar
- Private: to allow users to create private events
Understanding Calendar and first steps
The Calendar module is smart: it uses a content type that you create yourself (you can call it ‘event’) to display events in a calendar. It basically adds a ‘Calendar’ style to the Views module. So what you have to do is create a content type to manage your events (which should include a date field, obviously), and then create a view to display them.
Sean Effel, from Drupaltherapy, shows the detailed steps on how to achieve this in his screencast. Just follow the link and watch the video.
Setting up a per-group calendar
What Sean doesn’t show in his screencast however, is how to set up a per-group and a per-user calendar. If you followed the video, you should now have a public/general calendar showing all of your events.
Creating some test data
In your Drupal installation, create 4 events entitled ‘Private user event’, ‘Public user event’, ‘Private group event’ and ‘Public group event’. The idea is that, at the end of this article, you should have: a public calendar displaying the ‘Public user event’ and ‘Public group event’; a group calendar displaying (when you are logged in as a group member) the ‘Public group event’ and ‘Private group event’; and finally a user calendar displaying (when you are logged in as the user who created the ‘Private user event’) the ‘Private user event’ and ‘Public user event’.
Setting up Organic Groups
If you want to have private group events, you simply have to check “Visibility chosen by author/editor …” under “Visibility of posts” in “Organic groups access configuration”. This will allow the event author to create private group events, visible only by group members.
Setting up the per-group calendar view
Go to Site building > Views > List. Select your general calendar view (which should be called calendar) and click on Clone. As View name, use ‘calendar_og’, enter a description such as ‘per-group calendar view’ and click Next.
Now, I ran into what looks like a bug: after you clone the view, if you try to get a Preview of the “Calendar page” display, you will only get the navigation bars. To fix this, change the style plugin to “Calendar” in “Defaults”, AS WELL AS in “Year view”, “Month view”, “Day view” and “Week view”.
Then, you will need to change the Path of this new view. Select “Calendar page” > Page settings > Path and change it to ‘node/%/calendar’. Also, if you wish, set the Menu to “Menu tab” and entitle it “Calendar”.
You now need to add an argument to the view in order to tell it which group it should retrieve the events from. Add an argument of the type “Organic groups: Groups”, select “Provide default argument” and “Node ID from URL”. You can also select a “Group nodes” validator if you wish. DON’T FORGET, after you have added this argument, to reorder the arguments so that the FIRST one is “Organic groups: Groups” and the second is the date argument, otherwise the view will not work.
If you go to a group page, you should now have a tab saying “Calendar” or, if you didn’t set a Menu tab, you can navigate to http://urlofyoursite/node/%gid%/calendar, and you should see a calendar displaying only the events of the group identified by the id given in the URL.
Edit the ‘Private group event’ you created earlier and make it private, using the “Organic groups” option, and it should not appear in the group calendar, unless you are logged in as a group member.
Setting up a per-user calendar
Almost the same steps can be followed to set up a per-user calendar; you simply have to change a few things:
- Change the name of the cloned view to something like ‘calendar_my’ or ‘calendar_user’
- Change the path of the Calendar page to ‘user/%/calendar’
- Change the “Organic groups: Groups” argument to a “User: Uid” argument and in the “Provide default argument” select “User ID from URL”. Don’t forget, once again, to put the argument first in the list
At the end of these steps, you should be able to see a calendar when you navigate to http://urlofyoursite/user/%uid%/calendar, showing only the events created by the user identified by %uid%.
User private events
In order to allow users to create private events (i.e. accessible only to themselves), I used a very simple module: Private. There might be some other modules that can do the same thing and more, but Private simply does the job. It allows you to let your users say whether they want a specific node to be private or not. Once the module is installed, edit the ‘event’ content type you created and, under ‘Workflow settings’, set the ‘Privacy’ option to ‘Enabled (public by default)’. This will enable a checkbox in the event creation form, allowing your users to mark an event as private.
Go back to the ‘Private user event’ you created earlier and make it private. It should now only show in your per-user calendar when you are logged in as the author of the event.
Conclusion: views power
Views is a very powerful module and the idea of using it through the Calendar module is great. Given that your event content type is a simple node, you can also easily add more features to it, such as geolocation using the GMap and Location modules.
As Drupal is moving to the localization server localize.drupal.org, here’s a small tip that took me a little while to find, but that can be very useful, as it will allow you to contribute your translations back to the Drupal community more efficiently.
Let’s say you install a module on your own Drupal website, but this module is not fully translated into whatever language your website is in. You can, obviously, use Drupal’s translation interface on your own website and translate whatever is missing from there. The problem is that this won’t benefit the Drupal community. So then you might think about going to localize.drupal.org, translating the strings you need there, and then exporting the module’s translation strings to a .po file. The problem is that, right after you’re done translating on localize.drupal.org, your strings will have to wait for the approval of a moderator before they can be exported to a .po file. So, how can you translate those strings once, have them immediately installed on your own Drupal website and, at the same time, give the translations to the community?
Here is how to achieve this:
- First, go to localize.drupal.org and export the module strings you need to translate to a .po file, using “All in one file” as an option
- Second, translate the strings in the .po file you just downloaded (using a program like PoEdit can be convenient)
- Finally, import the .po file BOTH into your own Drupal installation AND into localize.drupal.org: you will immediately have the strings translated on your own Drupal installation without having to wait for moderator approval, and at the same time your translations will be uploaded to localize.drupal.org, helping the community
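Before importing, it can be handy to check how much of the exported .po file is still untranslated. The following is a quick Python sketch with a deliberately simplified parser (it ignores the header entry, plural forms and multi-line strings, so treat the count as an approximation):

```python
def count_untranslated(po_text):
    """Count msgid entries whose msgstr is empty in a .po file.
    Simplified: assumes single-line msgid/msgstr pairs."""
    untranslated = 0
    lines = po_text.splitlines()
    for i, line in enumerate(lines):
        # skip the header entry, which is msgid ""
        if line.startswith('msgid "') and line != 'msgid ""':
            # find the matching msgstr on a following line
            for nxt in lines[i + 1:]:
                if nxt.startswith("msgstr"):
                    if nxt.strip() == 'msgstr ""':
                        untranslated += 1
                    break
    return untranslated

sample = '''msgid "Home"
msgstr "Accueil"

msgid "Log in"
msgstr ""
'''
count_untranslated(sample)  # 1
```

For serious work on .po files, a dedicated editor such as PoEdit (mentioned above) will of course give you the same information.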
Imagine you have two servers: on one server, you have a Mercurial repository with all your code, accessible through http. On the other one, you have a working copy of this code that you are using to test a website. Let’s call the first server REPOS and the second one DEVEL. This article describes how to automatically update the working copy on DEVEL after an hg push to REPOS.
In order to do this, the idea is to write a Mercurial hook that will execute after an hg push to REPOS and connect through ssh to DEVEL in order to issue the necessary hg pull and hg update. It’s fairly easy to do, but it needs some setup on both DEVEL and REPOS in order to work.
Setting up DEVEL
The following steps will be needed in order to setup DEVEL:
- Create a user named hgagent on DEVEL (sudo adduser hgagent). The account of this user will be used to connect by ssh from REPOS in order to do the hg pull and hg update
- If REPOS requires a username and password (or an SSH key) in order to do an hg pull, configure the auth section of the hgrc file of the hgagent user you just created in order to allow it to do an hg pull without needing to provide its username and password interactively
- Do an hg clone of the repository you want to update using the hgagent user (this assumes, obviously, that the repository is already accessible on REPOS), so that the files in the clone belong to hgagent, which will therefore be able to do the hg pull and hg update
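For the auth section mentioned above, Mercurial’s hgrc supports an [auth] block that supplies credentials for URLs matching a given prefix. A minimal sketch of hgagent’s ~/.hgrc might look like this (the prefix, username and password are placeholders for your own setup):

```ini
[auth]
# "repos" is just an arbitrary name grouping these three entries
repos.prefix = http://REPOS/path/to/repository
repos.username = hgagent
repos.password = secret
```

With this in place, an hg pull run from hgagent’s account no longer prompts for credentials, which is what the non-interactive hook needs.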
Setting up REPOS
The following steps are needed in order to set up REPOS:
- If your repository is not publicly accessible, you will need to set it up so that hgagent can access it using a username and password or an SSH key
- You will need to generate a pair of SSH keys in order to allow a non-interactive connection to DEVEL. If your repository is served over http or https, I strongly advise you to generate these SSH keys as the user under which your http server runs (most probably www-data), because when the hook is executed, it will run as the www-data user (or whatever user your http server runs as). You can use sudo -u www-data ssh-keygen to generate the keys. Note that you shouldn’t set a passphrase for your SSH key.
- Once the keys are generated, you need to copy the public key to /home/hgagent/.ssh/authorized_keys on DEVEL. You can use the following command to do so:
ssh-copy-id -i /var/www/.ssh/id_dsa.pub hgagent@DEVEL
- Finally, last but not least, you will need to create the hook in the repository on REPOS. In the .hg/hgrc file of the repository, add the following lines:
[hooks]
changegroup = ssh hgagent@DEVEL "cd /path/to/working/copy; hg pull; hg update"
That’s it! When you issue an hg push to REPOS, it should now automatically connect by ssh to the DEVEL server and update your working copy.