Hardware Switch not charging but works with a fully charged battery

khalkedon

Active Member
OP
Newcomer
Joined
Aug 15, 2007
Messages
32
Trophies
0
XP
351
Country
Hi people.

I replaced the broken USB Type-C port, but since then there is no charging indicator at all. When I check with a USB tester, it shows 0 amps.

I plugged a fully charged, known-good battery into the Nintendo Switch and it works perfectly: I can navigate all the menus and play games. But when I try to charge, there is no response at all and no charging logo appears. I even replaced the BQ24193 charging IC, but it made no difference.

I need your help. Thank you in advance.
 

SheriffBuck

Well-Known Member
Newcomer
Joined
Jan 6, 2020
Messages
98
Trophies
0
Location
Hampton, London
XP
421
Country
United Kingdom
OK.

Can you check if anything happens when you try both orientations on the USB C connector?

Check that you are getting the charger voltage VB on pin 9 of the M92, and the corresponding output VEX to the BQ on pin 28 of the M92.

Check the internal power selector status VCCIN on pin 5 of the M92, and the VSVR system 3.3 V rail on pin 6.

Measure VSYS on pins 15/16 of the BQ.

Next step is to check the CC lines, which run via an ESD protection device on the underside of the board near the USB-C. These can get damaged, causing the lines to short to ground. Buzz them through to the M92. You will probably need a breakout board on the USB-C to test this reliably. There are test pads on the underside too. Again, search the forum for a picture I posted. If you have continuity problems, this IC can be removed to aid diagnosis.
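Not official numbers, just the checklist above turned into a little decision tree so the order of checks is clear. The expected voltages are my own assumptions from typical Switch boards (VBUS around 5 V from a basic charger, VSVR 3.3 V, VSYS roughly battery voltage), so treat them as ballpark figures, not datasheet values:

```python
# Sketch of the diagnostic order above. Pin names and voltage
# thresholds are assumptions, not official Nintendo/TI figures.

def diagnose(vbus_v, vex_v, vsvr_v, vsys_v, cc_shorted_to_gnd):
    """Return the most likely next suspect given multimeter readings."""
    if cc_shorted_to_gnd:
        return "CC line short: check/remove ESD protection IC near USB-C"
    if vbus_v < 4.5:
        return "No charger voltage on M92 pin 9: check USB-C port soldering"
    if vex_v < 4.5:
        return "VB present but no VEX to BQ: suspect M92T36"
    if not (3.2 <= vsvr_v <= 3.4):
        return "3.3 V rail (VSVR) out of range: suspect M92T36 or its supply"
    if vsys_v < 3.0:
        return "VSYS low on BQ pins 15/16: suspect BQ24193 or battery path"
    return "Rails look OK: re-check CC lines and M92T36 communication"

# Example: charger voltage arrives but never leaves the M92
print(diagnose(5.1, 0.0, 3.3, 3.9, False))
```

With readings like the example (VB present on the M92 but no VEX passed on to the BQ), the M92T36 itself is the prime suspect, which matches the usual failure pattern when the port was damaged.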

Let me know how you get on with the above.

Sheriff





Sent from my SM-G975F using Tapatalk
 

khalkedon

Active Member
OP
Newcomer
Joined
Aug 15, 2007
Messages
32
Trophies
0
XP
351
Country

Thank you for your help. I solved the problem. The source of the problem was the M92T36 chip; I replaced it and happy ending :)
Thanks a lot, friend.
 
