And, of course, this post happens to be my 1,024th.
From Wikipedia, a description of 1024 as used in computing:
In binary notation, 1024 is represented as 10000000000, making it a simple "round number" occurring frequently in computer applications.
1024 is the maximum number of computer memory addresses that can be referenced with ten binary switches. This is the origin of the organization of computer memory into 1024-byte chunks.
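Those quoted facts are easy to verify yourself; here's a quick Python check (illustrative, not part of the Wikipedia entry):

```python
# Ten binary digits ("switches") can address 2**10 distinct values.
n = 2 ** 10
print(n)        # 1024

# And 1024 in binary is a one followed by ten zeros.
print(bin(n))   # 0b10000000000
```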
Thanks to Roy for the idea.