one time a CS professor took points off an assignment (build an interpreting calculator in assembly) and I had to bring in my number theory textbook to demonstrate that it is in fact the C language that does division incorrectly

C rounds towards 0 on integer division (since C99, anyway; before that, the rounding direction for negative operands was implementation-defined)


the "correct" definition of integer / and % is:
a / b = q, a % b = r, where
a = bq + r
0 <= r < |b|

(the remainder is always non-negative, which for a positive divisor means rounding the quotient towards negative infinity)

*math is made up and you can define division however you want, unless you want algebra to work, then there's some rules you gotta follow


@octopus Quality post and the last part is a good reminder.
Even if you like something, it's good and important to know the limits of it.


Fosstodon is an English speaking Mastodon instance that is open to anyone who is interested in technology; particularly free & open source software.