zookeeper, you are partially right. First off, the RAZR uses a Lithium Ion Polymer pouch pack, which is NOT a smart battery. There is NO circuitry inside the battery to assist in measuring the battery's State of Charge. At most, protection circuitry may exist to prevent deep discharge, possibly over-charge, and possibly thermal abuse as well (though all of those functions can be on-board in the phone instead, so there's no proof for or against that claim).
One thing is for sure: the pouch pack in our phones has only 2 (TWO) terminals, (+) and (-). There are NO data terminals, so any "information" the battery could collect if it were a "smart battery" would have no way to reach the phone's charging system for processing. For our specific phones, the PHONE does ALL the charging and ALL the monitoring of the battery's State of Charge.
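To put that in concrete terms: everything the phone knows, it measures itself on its own side of those two terminals. On Android/Linux phones, those phone-side readings are typically exposed through the kernel's power_supply sysfs class. Here's a minimal sketch; the exact path and units vary by kernel, so treat them as assumptions for any specific device:

```python
# Minimal sketch: reading the PHONE-side battery measurements on an
# Android/Linux device. The pack itself has no data pins; these values
# come from the phone's own charger/fuel-gauge circuitry.
# Assumption: the sysfs path and microvolt/microamp units vary by kernel.

BASE = "/sys/class/power_supply/battery"

def read_int(name):
    with open(f"{BASE}/{name}") as f:
        return int(f.read().strip())

voltage_v = read_int("voltage_now") / 1e6   # typically reported in microvolts
current_a = read_int("current_now") / 1e6   # typically reported in microamps

print(f"pack voltage: {voltage_v:.3f} V, current: {current_a:.3f} A")
```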
Now, to corroborate your claim: YES, coulomb counting is used, but it is used in combination with the voltage levels as well, so if the voltage levels are varying wildly, the result will vary too. I used voltage as the talking point to keep it simple, but since you bring it up, here's the deep discussion: a quick sketch of the idea first, then the full explanation below.
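The sketch: a rough picture of the kind of blending I mean. The OCV table, weights, and "resting" flag are invented purely for illustration (real gauges use calibrated, temperature-compensated curves); the point is that the voltage term is only trustworthy at rest, which is exactly why wildly varying voltages drag the estimate around:

```python
# Rough sketch of coulomb counting combined with a voltage check.
# The OCV table and the 0.9/0.1 weighting below are invented for
# illustration; real gauges use calibrated, temperature-compensated data.

# Hypothetical open-circuit-voltage -> SoC lookup for one Li-Po cell
OCV_TABLE = [(3.50, 0.05), (3.68, 0.20), (3.75, 0.50), (3.85, 0.80), (4.20, 1.00)]

def soc_from_ocv(volts):
    """Linear interpolation over the (voltage, SoC) table."""
    if volts <= OCV_TABLE[0][0]:
        return OCV_TABLE[0][1]
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if volts <= v1:
            return s0 + (s1 - s0) * (volts - v0) / (v1 - v0)
    return 1.0

def update_soc(soc, current_a, dt_s, capacity_ah, volts, resting):
    """current_a > 0 while charging, < 0 while discharging."""
    # 1) Coulomb counting: integrate current over the time step.
    soc += (current_a * dt_s / 3600.0) / capacity_ah
    # 2) Voltage correction: only meaningful when the cell is at rest,
    #    because load "rubber-bands" the terminal voltage up or down.
    if resting:
        soc = 0.9 * soc + 0.1 * soc_from_ocv(volts)  # gentle nudge toward OCV
    return min(max(soc, 0.0), 1.0)
```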
From BatteryUniversity.com:
"Does the Battery Fuel Gauge Lie?
Why the Battery State-of-charge cannot be measured accurately
Measuring stored energy in an electrochemical device, such as a battery, is complex and state-of-charge (SoC) readings on a fuel gauge provide only a rough estimate. Users often compare battery SoC with the fuel gauge of a vehicle. Calculating fluid in a tank is simple because a liquid is a tangible entity; battery state-of-charge is not. Nor can the energy stored in a battery be quantified because prevailing conditions such as load current and operating temperature influence its release. A battery works best when warm; performance suffers when it is cold. In addition, a battery loses capacity through aging.
Current fuel gauge technologies are fraught with limitations and this came to light when users of the new iPad assumed that a 100 percent charge on the fuel gauge should also relate to a fully charged battery. This is not always so and users complained that the battery was only at 90 percent.
The modern fuel gauge used in iPads, smartphones and laptops reads SoC through coulomb counting and voltage comparison. The complexity lies in managing these variables when the battery is in use. Applying a charge or discharge acts like a rubber band, pulling the voltage up or down, making a calculated SoC reading meaningless. In open circuit condition, as is the case when measuring a naked battery, a voltage reference may be used; however, temperature and battery age will affect the reading. The open terminal voltage as a SoC reference is only reliable when including these environmental conditions and allowing the battery to rest for a few hours before the measurement.
In the case of the iPad, a 10 percent discrepancy between fuel gauge and true battery SoC is acceptable for consumer products. The accuracy will likely drop further with use, and depending on the effectiveness of a self-learning algorithm, battery aging can add another 20-30 percent to the error. By this time the user has gotten used to the quirks of the device and the oddity is mostly forgotten or accepted. While differences in the runtime cause only a mild inconvenience to a casual user, industrial applications, such as the electric powertrain in an electric vehicle, will need a better system. Improvements are in the works, and these developments may one day also benefit consumer products.
Coulomb counting is the heart of today’s fuel gauge. The theory goes back 250 years when Charles-Augustin de Coulomb first established the “Coulomb Rule.” It works on the principle of measuring in-and-out flowing currents. Coulomb counting also produces errors; the outflowing energy is always less than what goes in. Inefficiencies in charge acceptance, especially towards the end of charge, tracking errors, as well as losses during discharge and self-discharge while in storage contribute to this. Self-learning and periodic calibrations through a full charge/discharge assure an accuracy most can live with."
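Here's a bare-bones version of the counter the article describes, including the imperfect charge acceptance and the full-charge recalibration it mentions (the 95 percent charge efficiency below is an assumed figure for illustration, not a measured one):

```python
# Bare-bones coulomb counter, per the article: integrate current in and out,
# charge acceptance is less than 100% efficient, and the count is pulled
# back into line at a known full charge.
# Assumption: the 0.95 charge efficiency is illustrative, not measured.

class CoulombCounter:
    def __init__(self, capacity_ah, soc=1.0, charge_efficiency=0.95):
        self.capacity_ah = capacity_ah
        self.soc = soc
        self.eff = charge_efficiency

    def step(self, current_a, dt_s):
        """current_a > 0 while charging, < 0 while discharging."""
        delta_ah = current_a * dt_s / 3600.0
        if delta_ah > 0:
            delta_ah *= self.eff   # not all inflowing charge is accepted
        self.soc = min(max(self.soc + delta_ah / self.capacity_ah, 0.0), 1.0)

    def recalibrate_full(self):
        """Called when the charger terminates at a true full charge."""
        self.soc = 1.0
```

Left running on its own, the small per-step errors and self-discharge accumulate, which is why the article says periodic calibration through a full charge/discharge is needed to pull the count back into line.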
For reference, here is the voltage chart for the last 6 hours on my phone. Note: this is a discharge curve (if you can call it that), and NO charging was performed during these 6 hours - in fact, none in the last 32 hours. The variation in the voltages works like this: while the phone is being used the voltage drops, and while it rests the voltage recovers.
In the example below, the voltage started out about 6 hours ago at 3.74V and, while at rest, recovered to 3.78V. The phone was then used for about 7.3 minutes across 4 calls, during which you can see the voltage dropped to nearly 3.7V. During the rest that followed, it recovered to just over 3.78V; then I had three calls totaling 6.15 minutes, during which it fell again to nearly 3.76V. It then sat and rested for over an hour, recovering to nearly 3.81V, fell to 3.71V over the next 1.5 hours of moderately light use, and finally recovered to 3.74V while resting during the last 0.25 hours.
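For a rough feel of why that in-call sag doesn't mean charge was actually lost: under load, the terminal voltage reads roughly the open-circuit voltage minus the current times the cell's internal resistance. The call current below is a pure assumption (the real draw for our phones isn't published), so this is only a back-of-envelope illustration:

```python
# Back-of-envelope: the ~0.08 V sag seen during the calls above, IF the
# calls drew roughly 0.4 A (an ASSUMED figure), implies an internal
# resistance around 0.2 ohms -- the stored charge itself barely changed.

ocv_v  = 3.78    # resting voltage before the calls (from the chart)
sag_v  = 3.70    # voltage seen under call load (from the chart)
load_a = 0.4     # ASSUMED average call current

r_internal = (ocv_v - sag_v) / load_a
print(f"implied internal resistance: {r_internal:.2f} ohm")  # ~0.20 ohm
```

Same charge in the pack, very different voltage readings depending on load: that's the rubber band the article describes.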
THIS is the PRIMARY reason why our meters get out of sync with the actual battery SoC.
I rest my case.