You have an array of integers, all initially set to zero. Support two operations.
INCREMENT l r a
- For each index i between l and r, increase the i-th element of the array by a.
SUM l r
- Compute the sum of the integers between indices l and r, inclusive.
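To pin down the semantics, here is a naive, array-based sketch of the two operations in C++ (names such as increment_range and range_sum are purely illustrative, and the array is assumed to be 1-based, as clarified in the comments below). Each call takes time proportional to the length of the range, so this only serves as a reference for what the operations mean, not as an efficient solution.

#include <vector>

// Illustrative only: plain-array version of the two operations (1-based indices).
// Each call is O(r - l + 1), which is too slow for large inputs.
void increment_range(std::vector<long long>& arr, int l, int r, long long a) {
    for (int i = l; i <= r; ++i)
        arr[i] += a;            // add a to the i-th element for every i in [l, r]
}

long long range_sum(const std::vector<long long>& arr, int l, int r) {
    long long s = 0;
    for (int i = l; i <= r; ++i)
        s += arr[i];            // accumulate elements l..r, inclusive
    return s;
}

int main() {
    std::vector<long long> arr(6, 0);           // a 1-based array of size 5 (index 0 unused)
    increment_range(arr, 2, 4, 3);              // add 3 to elements 2..4
    return range_sum(arr, 1, 5) == 9 ? 0 : 1;   // the sum over [1, 5] is now 9
}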
Constraints
Input Specification
The first line contains two positive integers: the size of the array and the number of operations that follow.
The next lines each contain a sequence of positive integers, one line per operation. If the first integer in the line is 1, then three integers follow, l, r, and a, indicating an INCREMENT operation.
Otherwise, the first integer in the line is 2, and two integers follow, l and r, indicating a SUM operation.
Output Specification
For each SUM operation, output the result of the query on its own line.
Sample Input
3 4
1 2 3 2
2 1 1
2 2 2
2 3 3
Sample Output
0
2
4
Comments
I am assuming the array starts at index 0?
So no, the array starts at index 1.
Can anyone tell me why I'm TLEing on testcase 5? I know there's some sort of optimization but I don't know the specifics.
Your time complexity is currently linear in the length of the range for each operation, which is insufficient given the constraints (over the whole input that adds up to far more work than the time limit allows, give or take). If you are stumped, you can try reading the tutorial :)).
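For anyone landing here later: one standard way to get logarithmic time per operation is a pair of Fenwick trees (binary indexed trees) handling range update and range sum together. The sketch below assumes 1-based indices, 64-bit sums, and illustrative names (B1, B2, range_add, range_sum, N, Q); the official tutorial may well present a different structure, such as a segment tree with lazy propagation.

#include <cstdio>
#include <vector>

int N, Q;
std::vector<long long> B1, B2;   // two Fenwick trees over indices 1..N

// Add v at position i in one Fenwick tree.
void bit_add(std::vector<long long>& bit, int i, long long v) {
    for (; i <= N; i += i & -i)
        bit[i] += v;
}

// Prefix sum of positions 1..i in one Fenwick tree.
long long bit_sum(const std::vector<long long>& bit, int i) {
    long long s = 0;
    for (; i > 0; i -= i & -i)
        s += bit[i];
    return s;
}

// INCREMENT l r a: add a to every element in [l, r].
void range_add(int l, int r, long long a) {
    bit_add(B1, l, a);
    bit_add(B1, r + 1, -a);
    bit_add(B2, l, a * (l - 1));
    bit_add(B2, r + 1, -a * r);
}

// Prefix sum of the actual array over [1, i].
long long prefix_sum(int i) {
    return bit_sum(B1, i) * i - bit_sum(B2, i);
}

// SUM l r: sum of the elements in [l, r].
long long range_sum(int l, int r) {
    return prefix_sum(r) - prefix_sum(l - 1);
}

int main() {
    scanf("%d %d", &N, &Q);
    B1.assign(N + 2, 0);
    B2.assign(N + 2, 0);
    for (int q = 0; q < Q; ++q) {
        int type;
        scanf("%d", &type);
        if (type == 1) {                 // INCREMENT l r a
            int l, r;
            long long a;
            scanf("%d %d %lld", &l, &r, &a);
            range_add(l, r, a);
        } else {                         // SUM l r
            int l, r;
            scanf("%d %d", &l, &r);
            printf("%lld\n", range_sum(l, r));
        }
    }
    return 0;
}

The idea is that adding a on [l, r] changes every prefix sum by a piecewise-linear function of the index, so one tree stores the slope contribution and the other the constant correction; each operation then costs only a few O(log N) tree updates or queries.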