Homebrew What palette number do I give to oamSet for extended palettes?

insanepotato

Well-Known Member
OP
Newcomer
Joined
Aug 1, 2009
Messages
53
Trophies
0
Location
broady
Website
numbat.cs.rmit.edu.au
XP
42
Country
Hello, I want to be able to load more OAM sprites.

I've mapped one of the main VRAM banks to sprite memory and banks F and G to sprite extended palettes, DMA-copied my palette into the extended palette area, and allocated graphics with oamAllocateGfx.
I'm using 16-colour (4-bit) images.
But what do I use for the "int palette_alpha" parameter of oamSet to use a palette in the extended palette area?


Also, after loading a few sprites I'm getting NULL when I call oamAllocateGfx, despite having mapped one of the main VRAM banks to object memory as opposed to using the smaller bank E.

This is how my VRAM banks are mapped, btw:
Code:
    vramSetBankA( VRAM_A_MAIN_BG );
    vramSetBankB( VRAM_B_MAIN_SPRITE );
    vramSetBankD( VRAM_D_MAIN_BG_0x06020000 );

    vramSetBankC( VRAM_C_SUB_BG );

    vramSetBankF( VRAM_F_SPRITE_EXT_PALETTE );
    vramSetBankG( VRAM_G_SPRITE_EXT_PALETTE );

    videoSetMode( MODE_5_2D | DISPLAY_BG2_ACTIVE | DISPLAY_BG3_ACTIVE | DISPLAY_SPR_ACTIVE | DISPLAY_SPR_1D_LAYOUT );
    videoSetModeSub( MODE_3_2D | DISPLAY_BG3_ACTIVE );

I init OAM with:
Code:
oamEnable( &oamMain );
oamInit( &oamMain, SpriteMapping_1D_32, true );

This is the relevant part of my sprite loading function:
Code:
    _oamGfx = oamAllocateGfx( &oamMain, _sSize, SpriteColorFormat_16Color );

    // Map bank F to LCD so the extended palette memory is writable
    vramSetBankF( VRAM_F_LCD );

    dmaCopy( _gfx, _oamGfx, _gfxLen );
    dmaCopy( _pal, &VRAM_F_EXT_SPR_PALETTE[0], _palLen );

    vramSetBankF( VRAM_F_SPRITE_EXT_PALETTE );

So now what do I use for the palette_alpha parameter when I call
Code:
void oamSet( OamState *oam, int id, int x, int y, int priority, int palette_alpha, SpriteSize size, SpriteColorFormat format, const void *gfxOffset, int affineIndex, bool sizeDouble, bool hide, bool hflip, bool vflip, bool mosaic )

Sorry if this is in the wrong section, but I couldn't seem to find a coding board, only the dstwo dev board >_
 

Dirbaio

Well-Known Member
Member
Joined
Sep 26, 2010
Messages
158
Trophies
0
Age
111
Location
Spain
Website
dirbaio.net
XP
108
Country
For the oamAllocateGfx returning NULL: yes, you're right, it happens because the sprite VRAM gets full.

It's weird, though, because the main banks are quite big, so you should be able to fit in a lot of sprites. Approximately, main banks are 128 KB, so you can fit in there a 256x256 16-bit image, a 512x256 8-bit image, or a 512x512 4-bit image (you're using 4-bit). At 4bpp a 64x64 sprite is 2 KB, so that's 64 of them per bank. Are you sure you're loading that many sprites? That's a lot.

If not, it's either a bug in libnds (unlikely), or you're leaking memory. Check that all the sprites you allocate are freed.
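The usual pattern is one oamFreeGfx per oamAllocateGfx (a minimal sketch; the size and format are the ones from your loading function):

Code:
// Allocate, use, then free: a leaked gfx block stays in the pool forever.
u16 *gfx = oamAllocateGfx( &oamMain, SpriteSize_64x64, SpriteColorFormat_16Color );
// ... dmaCopy the frame in, oamSet it, display it ...
oamFreeGfx( &oamMain, gfx );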

A nice thing you could try is looking at the VRAM using DeSmuME's tile viewer (or the VRAM viewer of the no$gba debug version, if you have it). That way you'll be able to see if you're loading the same sprites multiple times.


Also (probably unrelated to your problem), you're allocating both VRAM banks F and G to sprite palettes. Only one is needed: the 16 extended sprite palettes are 16*256*2 = 8 KB, and a single 16 KB bank already covers that.


I've always found the oam allocation system confusing. Maybe you should consider not using it: put all your sprites in a single sprite sheet, DMA it directly to OBJ VRAM, and use oamGetGfxPtr to create sprites. I did it for some small projects; it's really simpler, but less flexible. (See the sketch below.)
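Roughly like this (a sketch, assuming 64x64 4bpp frames stored back to back; sheetTiles/sheetTilesLen are hypothetical names for your converted data):

Code:
// Copy the whole sheet into main OBJ VRAM in one go, skipping the allocator.
dmaCopy( sheetTiles, SPRITE_GFX, sheetTilesLen );

// With SpriteMapping_1D_32 the gfx index counts in 32-byte steps,
// so frame n of a 64x64 4bpp sheet (2048 bytes each) starts at n * 2048/32.
u16 *frameGfx = oamGetGfxPtr( &oamMain, n * (2048 / 32) );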


Finally, palette_alpha is the palette number for your object. In your example it should be 0, since you're using the first palette. If you want to use more palettes, you can do this:

Code:
dmaCopy(_pal1, &VRAM_F_EXT_SPR_PALETTE[0][0], _pal1Len);
dmaCopy(_pal2, &VRAM_F_EXT_SPR_PALETTE[0][1], _pal2Len);

The first index is the slot number (sprites only have slot 0; BGs have slots 0..3, one per layer) and the second one is the palette number.
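Then the same number goes into oamSet when you create the sprite (a sketch; id, position, and the gfx pointer are from your own code):

Code:
// 6th argument is palette_alpha: this sprite uses extended palette 1.
// Extended palettes are 256-colour, hence the 8bpp format here.
oamSet( &oamMain, id, x, y, 0, 1,
        SpriteSize_64x64, SpriteColorFormat_256Color, _oamGfx,
        -1, false, false, false, false, false );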


Last thing, if you're using sprite extended palettes, you have to enable them in videoSetMode using DISPLAY_SPR_EXT_PALETTE.
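With your mode line from above, that would be:

Code:
videoSetMode( MODE_5_2D | DISPLAY_BG2_ACTIVE | DISPLAY_BG3_ACTIVE |
              DISPLAY_SPR_ACTIVE | DISPLAY_SPR_1D_LAYOUT |
              DISPLAY_SPR_EXT_PALETTE );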

(Yay long post. Hope it helps.)
 

insanepotato

Well-Known Member
OP
Newcomer
Joined
Aug 1, 2009
Messages
53
Trophies
0
Location
broady
Website
numbat.cs.rmit.edu.au
XP
42
Country
QUOTE said:
so you can fit in there a 256x256 16-bit image
no.. I'm definitely not loading that many. In total, I have 64 images I want to load as sprites.

According to the code below, I can only load 16 64x64 4-bit sprites... Is the OAM table full?
Code:
    for(int i = 0; true; i++){
        u16 *oamGfx = oamAllocateGfx( &oamMain, SpriteSize_64x64, SpriteColorFormat_16Color );
        if( !oamGfx ){
            printf("sid: %d\n", i);
            break;
        }
    }
    while(1){
        swiWaitForVBlank();
    }

QUOTE said:
A nice thing you could try is looking at the VRAM using desmume's tile viewer (or the vram viewer of no$gba debug version if you have it). That way you'll be able to see if you're loading the same sprites multiple times.
DeSmuME crashes when it loads my ROM, displaying an error message "ext0"


QUOTE said:
Finally, palette_alpha is the palette number for your object.
I knew this was the case for normal palettes; I didn't know if I needed something special once the palette I need is no longer in the normal 1 KB sprite palette RAM =P

and I'll unmap VRAM G.

Thanks for the reply =P
 

Dirbaio

Well-Known Member
Member
Joined
Sep 26, 2010
Messages
158
Trophies
0
Age
111
Location
Spain
Website
dirbaio.net
XP
108
Country
That's weird. If I've calculated it right, VRAM shouldn't get full with only 16 sprites:

64*64*16 = 65536 pixels.
65536/2 (since you're using 4bpp) = 32768 bytes = 32 KB.

Maybe libnds is doing some weird padding? I have no idea.
Try this: tell it to print the address of each sprite you alloc:

Code:
iprintf("%x\n", (u32)oamGfx);

If libnds is doing it right, each sprite you alloc should be 64*64/2 = 2048 = 0x800 bytes further along.


Also, what version of DeSmuME are you using? ROMs built with one of the recent libnds updates break on the old "stable" release.
Get yourself a more recent DeSmuME SVN version and try again
 

insanepotato

Well-Known Member
OP
Newcomer
Joined
Aug 1, 2009
Messages
53
Trophies
0
Location
broady
Website
numbat.cs.rmit.edu.au
XP
42
Country
I got DeSmuME, which worked fine; turns out it was Dualis that was crashing o.o

It seems to be allocating at the right addresses, but it still only allocates 16 sprites
Code:
6400000
6400800
6401000
6401800
6402000
6402800
6403000
6403800
6404000
6404800
6405000
6405800
6406000
6406800
6407000
6407800

I'm looking at the Memory Map at http://dev-scene.com/NDS/Tutorials_Day_2
QUOTE said:
Virtual Video RAM     Start          Stop           Size
Main Background       0x06000000     0x0607FFFF     512KB
Sub Background        0x06200000     0x0621FFFF     128KB
Main Sprite           0x06400000     0x0643FFFF     256KB
Sub Sprite            0x06600000     0x0661FFFF     128KB

According to this information, there should still be a lot of space left.

I tried changing the sprite size in the loop; regardless of the parameters, oamAllocateGfx only ever allocates up to 0x06408000.

Which is really frustrating, since everything from 0x06408000~0x0643FFFF is unusable! >
 

chintoi

Well-Known Member
Newcomer
Joined
Oct 9, 2008
Messages
51
Trophies
0
Website
Visit site
XP
65
Country
Serbia, Republic of
I think you should really only get 0x6400000 - 0x6420000; the rest belongs to the second slot, whatever that means:

VRAM_B_MAIN_SPRITE = 2,
VRAM_B_MAIN_SPRITE_0x06400000 = 2 | VRAM_OFFSET(0),
 

chintoi

Well-Known Member
Newcomer
Joined
Oct 9, 2008
Messages
51
Trophies
0
Website
Visit site
XP
65
Country
Serbia, Republic of
Since libnds's allocator seems to be broken, and the bank's address range and each sprite's size are both known, I'd try writing my own allocator - seems easy lol))
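Something like this would do (a throwaway sketch, assuming every sprite is a 64x64 4bpp frame and the bank is mapped at 0x06400000):

Code:
// Trivial bump allocator over the 128 KB sprite bank: hand out
// consecutive 2 KB chunks and never reuse them.
#define SPR_VRAM_BASE  0x06400000
#define SPR_VRAM_SIZE  (128 * 1024)    // bank B
#define SPR_CHUNK      (64 * 64 / 2)   // 64x64 at 4bpp = 2048 bytes

static u32 sprNext = 0;

u16 *mySpriteAlloc(void)
{
    if( sprNext + SPR_CHUNK > SPR_VRAM_SIZE ) return NULL;  // bank full
    u16 *p = (u16*)(SPR_VRAM_BASE + sprNext);
    sprNext += SPR_CHUNK;
    return p;
}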
 

insanepotato

Well-Known Member
OP
Newcomer
Joined
Aug 1, 2009
Messages
53
Trophies
0
Location
broady
Website
numbat.cs.rmit.edu.au
XP
42
Country
OK, after messing with the examples provided with devkitPro and peeking at libnds's source I've discovered:

In order for a sprite to use extended palettes, the last parameter of oamInit must be set to true.
This will make it set DISPLAY_SPR_ACTIVE and DISPLAY_SPR_EXT_PALETTE on the REG_DISPCNT register (see libnds's sprite.c).

AND oamSet must be called with SpriteColorFormat_256Color. Any other colour format makes it use the standard OBJ palette.
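So the working combination looks like this (a sketch; palette 1 just as an example):

Code:
oamInit( &oamMain, SpriteMapping_1D_32, true );   // true = extended palettes on

// Only SpriteColorFormat_256Color routes the sprite to an extended palette;
// palette_alpha (the 1 here) then picks which of the 16 palettes.
oamSet( &oamMain, id, x, y, 0, 1,
        SpriteSize_64x64, SpriteColorFormat_256Color, _oamGfx,
        -1, false, false, false, false, false );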

If oamSet is called with SpriteColorFormat_256Color but the graphic data is actually 16-colour, it displays all muddled up. This is useless for my cause >_<

Still don't know why oamAllocateGfx only allocates 16 64x64 sprites...
I have no idea how to write a new allocator. I had a look at libnds's sprite_alloc.c; way over my head.

This is all so frustrating >~
 

chintoi

Well-Known Member
Newcomer
Joined
Oct 9, 2008
Messages
51
Trophies
0
Website
Visit site
XP
65
Country
Serbia, Republic of
It was partially a joke. That memory is specifically dedicated to sprites and thus can't be used for anything else but, erm, sprites allocated with the default allocator. Suppose we throw the allocator out and do this:

Code:
_oamGfx = (u16*)(0x6407800 + (0x6407800 - 0x6407000));  // = 0x6408000, the next 2 KB slot

and test whether we can use this pointer normally. It's not like anything really bad is going to happen)
 

Dirbaio

Well-Known Member
Member
Joined
Sep 26, 2010
Messages
158
Trophies
0
Age
111
Location
Spain
Website
dirbaio.net
XP
108
Country
Yeah, that's the "easy" approach.

Just copy your sprites manually to VRAM, and use pointers manually. It's not that hard.


Say you want to load 30 64x64 sprites? Each sprite is 0x800 bytes long, so:

Code:
for(int i = 0; i < 30; i++)
{
    // Main OBJ VRAM starts at 0x06400000; sprite i sits i*0x800 bytes in.
    u16* oamGfx = (u16*)(0x06400000 + i*0x800);
    dmaCopy(_gfx, oamGfx, _gfxLen);
}

Also, if you want to use extended palettes, then you're forced to use 8bpp graphics. See http://nocash.emubase.de/gbatek.htm#dsvideoextendedpalettes
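And I think I see your 16-sprite limit now: it may not be the allocator at all. The gfx index in OAM is 10 bits, and SpriteMapping_1D_32 counts it in 32-byte steps, so only 1024*32 = 32 KB is addressable; 0x06400000 + 0x8000 is exactly the 0x06408000 where your allocations stop. If I'm right, a bigger boundary lets it use the whole bank:

Code:
// SpriteMapping_1D_256: 256-byte boundary, so 1024 * 256 = 256 KB addressable.
oamInit( &oamMain, SpriteMapping_1D_256, true );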
 

insanepotato

Well-Known Member
OP
Newcomer
Joined
Aug 1, 2009
Messages
53
Trophies
0
Location
broady
Website
numbat.cs.rmit.edu.au
XP
42
Country
chintoi, Dirbaio, thanks, I think the approach you mentioned is what I'll use, if I can get it all working fine.

As of now I've loaded a few sprites using this method and so far it seems alright.
Since it forces 256-colour and I'm using 16, I thought I'd be clever and only load the first 16 colours of the palette, then use spriteIndex/(EXT_PALETTE_SIZE/EXT_PALETTE_LEN) as the index for oamSet (hope my maths is right there).

I'm readjusting the colours in the tile data with:
Code:
    u8 *temp = (u8*)malloc(_gfxLen);
    memcpy(temp, _gfx, _gfxLen);
    // Adjust palette offset (index 0 stays transparent)
    for(u32 i = 0; i < _gfxLen; ++i){
        if( temp[i] == 0 ) continue;
        temp[i] += ((_spriteIndex % EXT_PALETTE_COUNT) * EXT_PALETTE_LEN);
    }
If this doesn't work, I'll just deallocate animation frames that aren't shown and reallocate them when I need them.
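So the index I pass to oamSet works out as below (assuming my constants: EXT_PALETTE_SIZE = 256, EXT_PALETTE_LEN = 16 colours per sub-palette, EXT_PALETTE_COUNT = 16 sub-palettes per extended palette):

Code:
// Sprite n's 16 colours live in extended palette n / EXT_PALETTE_COUNT,
// shifted up by (n % EXT_PALETTE_COUNT) * EXT_PALETTE_LEN entries.
int extPal = _spriteIndex / (EXT_PALETTE_SIZE / EXT_PALETTE_LEN);  // palette_alpha for oamSet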

relminator said:
Code:
oamInit( &oamMain, SpriteMapping_1D_32, true );
I would hazard a guess that this is your problem.
Nah, I was using true for that last parameter; the problem was I was using SpriteColorFormat_16Color =)


Thank you everyone for the help, it's very much appreciated ^^
 
