Python returns MagicMock object instead of return_value
I have a Python file `a.py` which contains two classes:

```python
class A(object):
    def method_a(self):
        return "Class A method a"


class B(object):
    def method_b(self):
        a = A()
        print(a.method_a())
```
I would like to unit-test `method_b` in class `B` by mocking `A`. Here is the content of the file `testa.py` for this purpose:
```python
import unittest
from unittest import mock

import a


class TestB(unittest.TestCase):
    @mock.patch('a.A')
    def test_method_b(self, mock_a):
        mock_a.method_a.return_value = 'Mocked A'
        b = a.B()
        b.method_b()


if __name__ == '__main__':
    unittest.main()
```
I expect to get `Mocked A` in the output. But what I get is:

```
<MagicMock name='A().method_a()' id='4326621392'>
```
Where am I going wrong?
When you `@mock.patch('a.A')`, you are replacing the class `A` in the code under test with `mock_a`. In `B.method_b` you then set `a = A()`, which is now `a = mock_a()`, i.e. `a` is the `return_value` of `mock_a`. As you haven't specified this value, it's a regular `MagicMock`; that isn't configured either, so you get the default response (yet another `MagicMock`) when calling methods on it.
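To make that chain concrete, here is a minimal standalone sketch (not from the original post) of how an unconfigured `MagicMock` behaves:

```python
from unittest import mock

mock_a = mock.MagicMock()

# Calling the mock returns its return_value, a child MagicMock.
instance = mock_a()
assert instance is mock_a.return_value

# Configuring method_a on mock_a itself does not touch the instance:
mock_a.method_a.return_value = 'Mocked A'

# instance.method_a is mock_a.return_value.method_a, a different,
# unconfigured child, so calling it yields yet another MagicMock.
print(instance.method_a())  # <MagicMock name='mock().method_a()' id=...>
```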
Instead, you want to configure the `return_value` of `mock_a` to have the appropriate method, which you can do as either:
```python
mock_a().method_a.return_value = 'Mocked A'
#     ^ note the parentheses
```
or, perhaps more explicitly:
```python
mock_a.return_value.method_a.return_value = 'Mocked A'
```
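For completeness, here is a sketch of the corrected `testa.py` (using Python 3's `unittest.mock`; the final assertion is added purely for illustration and was not in the original):

```python
import unittest
from unittest import mock

import a


class TestB(unittest.TestCase):
    @mock.patch('a.A')
    def test_method_b(self, mock_a):
        # Configure the instance that B.method_b will create via A().
        mock_a.return_value.method_a.return_value = 'Mocked A'
        b = a.B()
        b.method_b()
        # Illustrative check that the mocked method was actually called.
        mock_a.return_value.method_a.assert_called_once_with()


if __name__ == '__main__':
    unittest.main()
```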
Your code would have worked in the case `a = A` (assigning the class itself, not creating an instance), as then `a.method_a()` would have triggered your mock method.
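To illustrate that last point, here is a small standalone sketch (the bare name `a` stands in for the local variable inside `method_b`):

```python
from unittest import mock

mock_a = mock.MagicMock()
mock_a.method_a.return_value = 'Mocked A'

# Binding the class itself (a = A) means the name refers to mock_a directly,
# so method_a is looked up on mock_a, where it was configured.
a = mock_a
print(a.method_a())  # prints: Mocked A
```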