Python returns MagicMock object instead of return_value

I have a Python file, a.py, which contains two classes, A and B.

    class A(object):
        def method_a(self):
            return "Class A method a"

    class B(object):
        def method_b(self):
            a = A()
            print(a.method_a())

I would like to unittest method_b in class B by mocking A. Here is the content of the file for this purpose:

    import unittest
    import mock
    import a

    class TestB(unittest.TestCase):

        @mock.patch('a.A')
        def test_method_b(self, mock_a):
            mock_a.method_a.return_value = 'Mocked A'
            b = a.B()
            b.method_b()

    if __name__ == '__main__':
        unittest.main()

I expect to get Mocked A in the output. But what I get is:

    <MagicMock name='A().method_a()' id='4326621392'>

Where am I going wrong?

When you @mock.patch('a.A'), you are replacing the class A in the code under test with mock_a.

In `B.method_b` you then set `a = A()`, which is now `a = mock_a()` - i.e. `a` is the *return_value* of `mock_a`. As you haven't specified this value, it's a regular `MagicMock`; this isn't configured either, so you get the default response (yet another `MagicMock`) when calling methods on it.
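You can see this chain of defaults with a bare `MagicMock` standing in for the patched class - a minimal sketch using nothing beyond `unittest.mock`:

```python
from unittest import mock

mock_a = mock.MagicMock()     # what the class A becomes under the patch
instance = mock_a()           # what a = A() produces: mock_a.return_value
result = instance.method_a()  # unconfigured, so yet another MagicMock

# repr looks like <MagicMock name='mock().method_a()' id=...>,
# mirroring the output in the question
print(result)
```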

Instead, you want to configure the `return_value` of `mock_a` to have the appropriate method, which you can do in either of two ways:

    mock_a().method_a.return_value = 'Mocked A' 
        # ^ note parentheses

or, perhaps more explicitly:

    mock_a.return_value.method_a.return_value = 'Mocked A'
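These two spellings are equivalent because calling a `MagicMock` hands back its `return_value`, and it's the same object on every call - a quick check using only `unittest.mock`:

```python
from unittest import mock

mock_a = mock.MagicMock()

# Calling the mock returns its return_value, and it is the same
# object every time, so both spellings configure the same attribute.
assert mock_a() is mock_a.return_value
assert mock_a() is mock_a()

mock_a().method_a.return_value = 'Mocked A'
assert mock_a.return_value.method_a() == 'Mocked A'
```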

Your code would have worked in the case a = A (assigning the class, not creating an instance), as then a.method_a() would have triggered your mock method.
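Putting it together, a corrected test might look like the sketch below. To keep it self-contained, the two classes are defined in the same file and patched via `__name__`; with the question's layout (classes in a.py), the decorator target would simply be `'a.A'`:

```python
import unittest
from unittest import mock


# Stand-ins for the two classes from the question's a.py.
class A(object):
    def method_a(self):
        return "Class A method a"


class B(object):
    def method_b(self):
        a = A()
        print(a.method_a())


class TestB(unittest.TestCase):

    # Patch A where B looks it up; with the question's layout
    # this target string would be 'a.A'.
    @mock.patch(__name__ + '.A')
    def test_method_b(self, mock_a):
        # Configure the instance that A() returns, not the class itself.
        mock_a.return_value.method_a.return_value = 'Mocked A'
        b = B()
        b.method_b()  # prints: Mocked A
        mock_a.return_value.method_a.assert_called_once_with()
```

Run with `python -m unittest` (or end the file with the usual `unittest.main()` block, as in the question).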