Charging Myths

Talaria

...
OP
Member
Joined
Jan 31, 2007
Messages
584
Trophies
0
Location
...
Website
Visit site
XP
259
Country
New Zealand
I am about to buy a new cellphone and have heard numerous myths about how to charge it so you get the maximum battery life. So I'm a bit perplexed as to whether there is any basis behind them. The most common one I hear is:

Charge the battery for about 24 hours the first time, then let it go completely dead and recharge it over the course of the first few weeks of using the new phone to get the optimum battery life.

I asked the store where I'm getting it from for a reasonable price, but they don't know jack about what they're talking about. Suggestions?
 

xalphax

Internet killed the Ponystar.
Member
Joined
Nov 18, 2006
Messages
1,298
Trophies
1
Age
28
Location
here'n'there
Website
Visit site
XP
1,204
Country
Croatia
QUOTE said:
Overcharging a Li-poly battery will likely result in explosion and/or fire. During discharge on load, the load has to be removed as soon as the voltage drops below approximately 3.0 V per cell (used in a series combination), or else the battery will subsequently no longer accept a full charge and may experience problems holding voltage under load.

http://en.wikipedia.org/wiki/Lithium-ion_polymer_battery

most cellphone batteries are lithium-ion polymer.
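For illustration, the per-cell discharge cutoff described in the quote can be sketched as a simple check. This is a toy sketch, not real battery-management firmware; the 3.0 V figure comes from the quoted Wikipedia text, and the function name and example voltages are made up:

```python
# Toy sketch of the discharge-cutoff rule quoted above.
# Real battery-management firmware is far more involved.

DISCHARGE_CUTOFF_V = 3.0  # per cell, from the quoted figure

def should_disconnect_load(cell_voltages):
    """Return True if any series cell has sagged below the cutoff."""
    return any(v < DISCHARGE_CUTOFF_V for v in cell_voltages)

# Example: a 2-cell series pack with one weak cell
print(should_disconnect_load([3.7, 2.9]))  # True: disconnect the load
print(should_disconnect_load([3.7, 3.6]))  # False: keep running
```

The point of checking every cell in a series pack is that the weakest cell hits the cutoff first, long before the total pack voltage looks alarming.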
 

Mangofett

GBAtemp Testing Area
Member
Joined
May 14, 2006
Messages
4,885
Trophies
1
Age
19
XP
1,059
Country
United States
QUOTE said:
Overcharging a Li-poly battery will likely result in explosion and/or fire. During discharge on load, the load has to be removed as soon as the voltage drops below approximately 3.0 V per cell (used in a series combination), or else the battery will subsequently no longer accept a full charge and may experience problems holding voltage under load.

http://en.wikipedia.org/wiki/Lithium-ion_polymer_battery

most cellphone batteries are lithium-ion polymer.
That statement is false, unless you're talking about stone-age lithium cells. Today's lithium batteries have protection circuitry that cuts off charging before they can be overcharged.

Also, letting the battery run completely dead does nothing significant anymore. That would be called a deep-discharge cycle, but again, the protection circuitry in the battery cuts the load before a true deep discharge can happen.

However, keeping it at full charge all the time isn't good for the battery either. Just use it, and recharge as needed. It's a cellphone; you can always get a new battery if you absolutely have to. I've never needed to replace a cellphone battery.
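The protection behaviour described above amounts to a pair of per-cell threshold checks. A minimal sketch, assuming typical textbook Li-ion thresholds (~4.2 V overcharge cutoff, ~3.0 V deep-discharge cutoff); the exact numbers vary by pack and are not taken from any specific datasheet:

```python
# Rough sketch of a Li-ion protection circuit's two main cutoffs.
# Threshold values are typical illustrative figures, not from a real pack.

OVERCHARGE_CUTOFF_V = 4.2      # charging stops at/above this per-cell voltage
DEEP_DISCHARGE_CUTOFF_V = 3.0  # load is cut at/below this per-cell voltage

def protection_state(cell_voltage):
    """Classify what the protection circuit would do at this cell voltage."""
    if cell_voltage >= OVERCHARGE_CUTOFF_V:
        return "stop charging"
    if cell_voltage <= DEEP_DISCHARGE_CUTOFF_V:
        return "cut load"
    return "normal"

print(protection_state(4.25))  # "stop charging"
print(protection_state(3.8))   # "normal"
print(protection_state(2.9))   # "cut load"
```

This is why neither overcharging nor deep-discharging is something the user normally has to worry about on a modern phone: both ends of the range are fenced off in hardware.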
 

xalphax

Internet killed the Ponystar.
Member
Joined
Nov 18, 2006
Messages
1,298
Trophies
1
Age
28
Location
here'n'there
Website
Visit site
XP
1,204
Country
Croatia
ah, oh well :D


My bad. But still, in my experience, if I leave my cellphone charging too long it gets warm, and I don't think that's a good sign.
 
