## Section 4.4 Bitwise Logical Operations

The C/C++ bitwise logical operators are:

- and: `&`
- inclusive or: `|`
- exclusive or: `^`
- complement: `~`

It is easy to see what each of these operators does by using truth tables. To illustrate how truth tables work, consider the algorithm for binary addition. In Section 3.1 we saw that the $$i^{th}$$ bit in the result is the sum of the $$i^{th}$$ bit of one number plus the $$i^{th}$$ bit of the other number plus the carry produced from adding the $$(i - 1)^{th}$$ bits. This sum will produce a carry of zero or one. In other words, a bit adder has three inputs (the two corresponding bits from the two numbers being added and the carry from the previous bit addition) and two outputs (the result bit and the carry).

In a truth table we have a column for each input and each output. We write down all possible input bit combinations and then show the output(s) in the corresponding row. A truth table for the bit-by-bit addition of z = x + y is shown in Table 4.4.1. We use the notation x[i] to represent the $$i^{th}$$ bit in the variable x.

The bitwise logical operators act on the corresponding bits of their operands (two operands for `&`, `|`, and `^`; one for `~`) as shown in Table 4.4.2 through Table 4.4.5.

Make sure that you distinguish these bitwise logical operators from the C/C++ logical operators, &&, ||, and !. The logical operators treat each operand as a single true/false value (zero is false, any nonzero value is true) rather than working on individual bits. For comparison, the truth tables for the C/C++ logical operators are shown in Table 4.4.6 through Table 4.4.8.

Now we are prepared to see how we can convert from the ASCII character code to the int format. The & operator works very nicely for the conversion. First, look at the comparison between each hexadecimal character and its corresponding integer value in Table 4.4.9.

For the characters '0'–'9' the conversion from the ASCII value to an int value can be easily done with the code:

```c
aChar = aChar & 0x0f;
anInt = (int)aChar;
```


where aChar is a char variable and anInt is an int variable. The & operation used this way is often called masking; the constant 0x0f is the mask. The (int) operation casts the value stored in aChar to an int, effectively extending it to 32 bits.