Answer from cs61c-cb (minh uyen nguyen 16765774) for Question 3

The major difference between computer numbers and numbers in the real world is that computer numbers have a fixed, limited size and hence limited precision. Because of this, in the old days many programmers wrote (0.5 - x) + 0.5 instead of 1.0 - x to get a more precise result.
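A short Python sketch of both points: the limited precision of floats, and the rewritten subtraction. The specific values here are illustrative, not from the original answer. On modern IEEE 754 hardware every subtraction is correctly rounded, so for typical inputs the two forms agree; the rewrite mattered mainly on older machines whose subtraction was less accurate (reportedly, machines lacking a guard digit).

```python
# Floats have a fixed number of significand bits (53 for an IEEE 754
# double), so results are rounded to the nearest representable value,
# unlike real-number arithmetic.
print(0.1 + 0.2 == 0.3)        # False: neither operand is stored exactly
print(1.0 + 2**-53 == 1.0)     # True: the addend is below half an ulp of 1.0

# The historical rewrite of 1.0 - x.  On IEEE 754 hardware both forms
# give the same (correctly rounded) result for a typical x like this one.
x = 0.7
print(1.0 - x)                 # 0.30000000000000004
print((0.5 - x) + 0.5)         # 0.30000000000000004
```

The point of the demo is that the inaccuracy comes from the finite representation itself (0.7 is already stored inexactly), not from the subtraction operator on a modern machine.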