Clear the least significant set bit

Question | Aug 9, 2017 | rparekh 

The byte x is initialized to 0x14 (20 decimal or 00010100 binary):

byte x = 0x14;  // 00010100
// clear least significant set bit
// ____?____    // 00010000

Which of the following bitwise operations on x clears the least significant set bit in x? After the bit is cleared, x should be 0x10 (00010000). Note that the correct choice is a generic solution that works on any value and on any integer size (int, byte, or short).
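For reference, a well-known identity that matches the requirement is x & (x - 1): subtracting 1 flips the lowest set bit to 0 and all bits below it to 1, so ANDing with the original clears exactly that bit. A minimal sketch (the class name is illustrative):

```java
public class ClearLowestSetBit {
    public static void main(String[] args) {
        byte x = 0x14;                        // 00010100
        // x - 1 = 00010011; AND with x clears the lowest set bit
        byte cleared = (byte) (x & (x - 1));  // 00010000
        System.out.println(Integer.toBinaryString(cleared & 0xFF)); // prints 10000
    }
}
```

Because the identity relies only on two's-complement subtraction and AND, it works for any nonzero value and any integer width, which is what the question's note demands.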