Canadian Computing Competition: 2008 Stage 1, Senior #2
The game "Pennies in the Ring" is often played by bored computer programmers who have gotten tired of playing solitaire. The objective is to see how many pennies can be put into a circle. The circle is drawn on a grid, with its center at the coordinate (0, 0). A single penny is placed on every integer grid coordinate (e.g., (1, 2), (-3, 0), etc.) that lies within or on the circle. It's not a very exciting game, but it's very good for wasting time. Your goal is to calculate how many pennies are needed for a circle with a given radius.
Input
The input is a sequence of positive integer values, one per line, where each integer is the radius of a circle. You can assume the radius will be less than or equal to 25000. The last integer will be indicated by 0. You may assume that the grid is large enough for two pennies to be on adjacent integer coordinates and not touch.
Output
You are to output, each on its own line, the number of pennies needed for each circle. You do not need to output anything for the final 0. You may assume that the number of pennies needed is less than 2 billion (which is only $20 million: computer scientists have lots of money).
Sample Input
2
3
4
0
Output for Sample Input
13
29
49
Comments
FYI, creating a square that fully encompasses the circle and checking every point in that square to see whether it lands in the circle will probably TLE. (You would have to check up to about 2.5 billion points.)
Can someone tell me what's wrong with my program?
Try this case
The correct output should be
Observe that there are 5 lattice points within the circle.
TLE for last two cases
This can be done in O(r) time per circle.