Pandas aggregate count distinct

Let's say I have a log of user activity and I want to generate a report of the total duration and the number of unique users per day.

    import numpy as np
    import pandas as pd
    df = pd.DataFrame({'date': ['2013-04-01','2013-04-01','2013-04-01','2013-04-02', '2013-04-02'],
        'user_id': ['0001', '0001', '0002', '0002', '0002'],
        'duration': [30, 15, 20, 15, 30]})

Aggregating duration is pretty straightforward:

    group = df.groupby('date')
    agg = group.aggregate({'duration': np.sum})
    agg
                duration
    date
    2013-04-01        65
    2013-04-02        45

What I'd like to do is sum the duration and count distinct users at the same time, but I can't seem to find an equivalent for count_distinct:

    # doesn't work -- count_distinct isn't a real pandas/numpy function
    agg = group.aggregate({'duration': np.sum, 'user_id': count_distinct})

The workaround below works, but surely there's a better way, no?

    group = df.groupby('date')
    agg = group.aggregate({'duration': np.sum})
    agg['uv'] = df.groupby('date').user_id.nunique()
    agg
                duration  uv
    date
    2013-04-01        65   2
    2013-04-02        45   1
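
A small aside: the second groupby in the workaround repeats work the existing group object already did, so the same column can be built from it directly. A minimal sketch, continuing from the code above:

    # reuse the existing groupby object instead of grouping a second time
    agg['uv'] = group.user_id.nunique()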

I'm thinking I just need to provide the aggregate function with a function that returns the count of distinct items in a Series, but I don't have a lot of exposure to the various libraries at my disposal. Also, it seems that the groupby object already knows this information, so wouldn't I just be duplicating effort?
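
For reference, a minimal sketch of exactly that idea, assuming pandas' Series.nunique and continuing from the code above (count_distinct is the hypothetical name used earlier, not a real pandas or numpy function):

    # define the hypothetical count_distinct in terms of Series.nunique
    def count_distinct(series):
        return series.nunique()

    agg = group.aggregate({'duration': np.sum, 'user_id': count_distinct})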

How about either of:

    >>> df
             date  duration user_id
    0  2013-04-01        30    0001
    1  2013-04-01        15    0001
    2  2013-04-01        20    0002
    3  2013-04-02        15    0002
    4  2013-04-02        30    0002
    >>> df.groupby("date").agg({"duration": np.sum, "user_id": pd.Series.nunique})
                duration  user_id
    date                         
    2013-04-01        65        2
    2013-04-02        45        1
    >>> df.groupby("date").agg({"duration": np.sum, "user_id": lambda x: x.nunique()})
                duration  user_id
    date                         
    2013-04-01        65        2
    2013-04-02        45        1
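
As an addendum not in the original answer: newer pandas versions accept string aggregator names, and pandas 0.25+ adds named aggregation, so neither numpy nor a lambda is needed:

    >>> df.groupby("date").agg({"duration": "sum", "user_id": "nunique"})
                duration  user_id
    date
    2013-04-01        65        2
    2013-04-02        45        1
    >>> df.groupby("date").agg(duration=("duration", "sum"), uv=("user_id", "nunique"))
                duration  uv
    date
    2013-04-01        65   2
    2013-04-02        45   1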

From: stackoverflow.com/q/18554920