Hacking Battery Life with Slot 1 Soln

MizuRyuu

New Member
OP
Newbie
Joined
Nov 17, 2006
Messages
1
Trophies
0
XP
20
Country
Canada
Like the topic says, I'm just wondering what the battery life is like when using a slot-1 flashcart instead of a slot-2 one. I know that using a slot-2 cart cuts battery life roughly in half; I'm just wondering what happens if I use a slot-1 instead.
 

jonsnow7412

Member
Newcomer
Joined
Nov 2, 2006
Messages
6
Trophies
0
XP
8
Country
United States
I'm using the DS-Link, which is a slot-1 cart, and I have seen no difference in battery life. Mine lasts, I think, 8-10 hours playing DS ROMs. I wish it had GBA support.
 

Tir

Well-Known Member
Member
Joined
Jun 7, 2006
Messages
154
Trophies
1
Website
Visit site
XP
239
Country
Reports suggest DS-X's battery life is a fair bit worse than G6 Lite's, so a card won't have great life just because it's slot 1.
 

cheeo

Well-Known Member
Newcomer
Joined
Dec 3, 2006
Messages
99
Trophies
0
XP
63
Country
United States
And for the longest time people have been saying that when the slot-1 carts come out, they'll get about the same battery life as an original DS cart. Goes to show you can't believe everything you read.
:D
 

nerd1

Well-Known Member
Newcomer
Joined
Jun 18, 2006
Messages
54
Trophies
1
Website
Visit site
XP
242
Country
I actually think my DS-X uses less power than my SuperKey/SuperCard miniSD solution. Over the weekend, after playing with my DS for a while, I put it into hibernation and went to bed. In the morning the battery light was still green and it didn't need a recharge. In the past, with the SuperKey/SuperCard miniSD combo, I remember my DS Lite would usually have switched itself off by morning.
 
