
3D lib

Open discussions on programming specifically for the PS Vita.
wonre
Posts: 32
Joined: Sat Mar 03, 2012 8:22 am

3D lib

Post by wonre » Wed Aug 10, 2016 8:03 am

Hi all.

I probably posted this in the wrong place (viewtopic.php?f=116&t=46428),
so I feel I'd better post it here.

I'm looking for a way to duplicate xerpi's very nice vita2d lib into something able to add further setup steps and function calls to allow GPU-accelerated 3D on the PS Vita and PS TV, through the vitasdk dev environment.

Let's say I want to spy on what a DRM-less eboot.bin in a VPK calls to set up 3D correctly: is there any tool out yet, compatible with HENkaku, to do that?

On the other hand, what tools might be suggested to try to detect syscalls made inside a DRM-less eboot.bin (decrypt, disassemble, etc.)?

I suppose spying would be easier, just like in the Nouveau project (whose goal is to recreate DirectX/OpenGL + NVIDIA miniport behaviour)...

romain337
Posts: 219
Joined: Wed Feb 29, 2012 10:42 am
Location: France
Contact:

Re: 3D lib

Post by romain337 » Wed Aug 10, 2016 8:28 am

It looks like all those new-gen GPU drivers. To learn how it works you don't really need to dig into the eboot at all. Just learn how to use a new-gen 3D API, and when you come to the Vita GPU you'll have what you need to experiment.

Keep in mind that we officially have no shader compiler, and 3D rendering with lighting and other fancy effects needs a more or less complicated shader. The rest is just sending a buffer, which vita2d already does.
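To illustrate the "just sending a buffer" part: a minimal sketch in C of filling a vertex buffer for one coloured triangle. The `Vertex` layout and `fill_triangle` helper are hypothetical, not the real vita2d or sceGxm API; in a real program the buffer would live in GPU-mapped memory and the real layout depends on how the vertex stream is declared when the vertex program is created.

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical vertex layout: position + packed RGBA colour. */
typedef struct {
    float x, y, z;
    unsigned int rgba;
} Vertex;

/* Fill a caller-provided buffer with one coloured triangle.
 * Returns the number of vertices written, or -1 if the buffer
 * is too small.  Submission to the GPU (after binding a shader)
 * is where the platform-specific work actually lives. */
static int fill_triangle(Vertex *buf, size_t capacity)
{
    if (capacity < 3)
        return -1;
    const Vertex tri[3] = {
        { -0.5f, -0.5f, 0.0f, 0xFF0000FFu },
        {  0.5f, -0.5f, 0.0f, 0xFF00FF00u },
        {  0.0f,  0.5f, 0.0f, 0xFFFF0000u },
    };
    memcpy(buf, tri, sizeof tri);
    return 3;
}
```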

wonre
Posts: 32
Joined: Sat Mar 03, 2012 8:22 am

Re: 3D lib

Post by wonre » Wed Aug 10, 2016 8:52 am

What I did with the nv2a was to spy on the shader pseudo-code and compare it with the output of NVIDIA's public shader compilers. Doing that, I could supply translation functions to make the public shader compiler actually useful on the nv2a (it took months). That was before the new era of vert+frag shaders though (it was vert+pix shaders back then)...
It worked like this in that old era:
shader src => (public Cg compiler) => pseudo-code => (graphics API) => GPU-compatible binary code
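The last stage of that pipeline can be sketched as a lookup-table translation: pseudo-code opcodes emitted by a public compiler are mapped, one by one, to GPU-specific binary words. All opcode names and values below are invented for illustration; a real table would come from the kind of reverse engineering described above.

```c
#include <stddef.h>
#include <stdint.h>

/* Toy pseudo-code instruction set (invented for illustration). */
enum PseudoOp { P_MOV = 0, P_ADD = 1, P_MUL = 2, P_DP4 = 3 };

/* Invented GPU opcode words, one per pseudo-op. */
static const uint32_t gpu_opcode[] = {
    /* P_MOV */ 0x10000000u,
    /* P_ADD */ 0x20000000u,
    /* P_MUL */ 0x30000000u,
    /* P_DP4 */ 0x40000000u,
};

/* Translate a pseudo-code stream into GPU words.  Returns the
 * number of words written, or -1 on an unknown opcode. */
static int translate(const int *pseudo, size_t n, uint32_t *out)
{
    for (size_t i = 0; i < n; i++) {
        if (pseudo[i] < 0 || pseudo[i] > P_DP4)
            return -1;
        out[i] = gpu_opcode[pseudo[i]];
    }
    return (int)n;
}
```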

So, given the name of the GPU, it will be useful to find other machines using it and see whether the manufacturer published tools for it. If it's not exactly the same GPU as the ones in other machines, a little adaptation at the pseudo-code level might still be possible.

Need to learn from any code doing 3D first.

Spying on how tiles are set up is important, because without them you can't approach 100% of the GPU's power while rendering 3D.
Last edited by wonre on Sat Aug 13, 2016 11:24 am, edited 1 time in total.

wonre
Posts: 32
Joined: Sat Mar 03, 2012 8:22 am

Re: 3D lib

Post by wonre » Wed Aug 10, 2016 9:18 am

I will try to find similarities between any Vita files and the file extensions
.pfx .cl .frag .fsh .fx .gs .ps .vert .vs .vsh

supported by the public PowerVR tools:
https://community.imgtec.com/downloads/ ... 2016-r1-2/

wonre
Posts: 32
Joined: Sat Mar 03, 2012 8:22 am

Re: 3D lib

Post by wonre » Wed Aug 10, 2016 4:12 pm

The PowerVR site's showcase refers to an open-source shader compiler:
https://github.com/James-Jones/HLSLCrossCompiler

For the nv2a I fully supported vs_1_1 and, in a very minimalist way, ps_1_1 pseudo-code, translating it into nv2a-compatible binary code (called bytecode nowadays, it seems): bytecode generated by the public NVIDIA Cg compiler when you specify vs_1_1 and ps_1_1 in the shader source.

For the Vita, shaders are much more evolved, so I can forget my previous work.

The README says which languages and bytecode formats are supported.
Let's hope the Vita APIs will be compatible with some of them (otherwise we'll have to upload to the GPU ourselves and translate the bytecode ourselves):

Supported bytecode formats:
cs_4_0 cs_4_1 cs_5_0
ds_5_0
hs_5_0
gs_4_0 gs_4_1 gs_5_0
ps_4_0 ps_4_0_level_9_1 ps_4_0_level_9_3 ps_4_0_level_9_0 ps_4_1 ps_5_0
vs_4_0_level_9_3 vs_4_0_level_9_0 vs_4_1 vs_5_0

Work is underway to support the DX9 bytecode formats:
ps_2_0 ps_2_a ps_2_b ps_3_0
vs_1_1 vs_2_0 vs_2_a vs_3_0

User avatar
progamer1515
Posts: 22
Joined: Tue Jan 14, 2014 2:23 am

Re: 3D lib

Post by progamer1515 » Thu Aug 11, 2016 2:09 am

IIRC the GPU uses the same architecture as the one in the Sega Dreamcast. It is simply a more powerful quad-core chip.

wonre
Posts: 32
Joined: Sat Mar 03, 2012 8:22 am

Re: 3D lib

Post by wonre » Fri Aug 12, 2016 7:49 am

Thanks for the tip. The Sega Dreamcast GPU seems to be the PowerVR CLX2. I haven't yet seen any shader reference on the dreamcastlive.net site or the mc.pp.se/dc/sw.html page, so maybe it was a fixed pipeline at the time. I even read that the first Japanese Dreamcasts were equipped with water cooling... amazing...

However, I'm a bit disgusted by the promotion made around the 'polygons per second' figure. It never takes into account the maximum DMA transfer rate, or the minimal Gouraud-lighting shader a game really needs. Comparison charts said Dreamcast = 3 million, PS2 = 20 million and original Xbox = 300 million, but in practice, once you set up Gouraud lighting, a texture and a projection matrix, and optimize speed by sending the largest possible push buffer through top-speed DMA transfers, the results are quite close: about 250,000 vertices per 60 Hz frame for the PS2 (super-optimized opcodes) and 330,000 vertices per 60 Hz frame for the original Xbox (no strips, which could multiply that by 3, and multiply by 20 to get polygons per second). So I bet the Dreamcast GPU was quite OK for its time. The advantage of the original Xbox was mainly due to tiles.
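The per-frame figures quoted above convert to per-second rates simply by multiplying by the 60 Hz refresh rate; a small sketch of that arithmetic (the x3 strip factor is the one mentioned in the post):

```c
/* Convert a per-frame vertex count into per-second throughput
 * at a given refresh rate (60 Hz in the figures above). */
static long vertices_per_second(long verts_per_frame, int hz)
{
    return verts_per_frame * (long)hz;
}

/* With triangle strips each new vertex can close a triangle, so
 * the effective rate can approach 3x the non-strip figure. */
static long strip_rate_per_second(long verts_per_frame, int hz)
{
    return vertices_per_second(verts_per_frame, hz) * 3;
}
```

So the PS2 figure of 250,000 vertices/frame works out to 15 million vertices per second, and the Xbox figure of 330,000 to 19.8 million, far from the 300 million of the marketing charts.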

About the Vita, the chip model is the SGX543, with a modern shader system.

I plan to extend vita2dlib a bit with the help of public-domain shader compilers and some dose of on-the-fly translation (like in pbkit.c, which legally extends OpenXDK). No ETA, since I have little free time each week, but bookmark this thread and check back for progress every week/month. The first target will be to display a textured object with Gouraud lighting (surfaces look smoother because of per-pixel color interpolation from the face normal and light direction). An important point is to activate tiles for optimized performance.

romain337
Posts: 219
Joined: Wed Feb 29, 2012 10:42 am
Location: France
Contact:

Re: 3D lib

Post by romain337 » Fri Aug 12, 2016 8:28 am

wonre wrote:maybe it was fixed pipeline at this time
It WAS a fixed pipeline, for sure.
wonre wrote:. I even read first jap dreamcast were equipped of water cooling
I think all Dreamcasts were water cooled. My PAL EU Dreamcast was water cooled.
wonre wrote: Important point is to activate Tiles for optimized performance.
What do you mean by "activate tiles"? Tiled rendering is done automatically; you don't have control over it. You don't enable or disable it. You're talking about the PowerVR tiled-rendering technique, which is implemented in almost every mobile GPU chip.
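For readers unfamiliar with the term: tiled rendering divides the screen into small fixed-size tiles and bins each triangle into the tiles its bounding box touches, so shading works on one on-chip tile at a time. PowerVR hardware does this internally, as noted above; the C sketch below is purely conceptual, and the 32x32 tile size is an assumption for illustration.

```c
/* Conceptual tile binning: map a triangle's screen-space bounding
 * box (pixel coordinates) to the range of tiles it may touch.
 * The 32x32 tile size is an assumed value for illustration. */
#define TILE 32

typedef struct {
    int min_tx, min_ty;   /* first tile column/row touched */
    int max_tx, max_ty;   /* last tile column/row touched  */
} TileRange;

static TileRange bin_bbox(int x0, int y0, int x1, int y1)
{
    TileRange r;
    r.min_tx = x0 / TILE;
    r.min_ty = y0 / TILE;
    r.max_tx = x1 / TILE;
    r.max_ty = y1 / TILE;
    return r;
}
```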

wonre
Posts: 32
Joined: Sat Mar 03, 2012 8:22 am

Re: 3D lib

Post by wonre » Fri Aug 12, 2016 12:20 pm

I don't know yet for the Vita, but the nv2a required some work to have tiles correctly configured (probably because, from OpenXDK, we could not rely on a non-existent DirectX API, so we had to redo what the NVIDIA miniport driver was doing to set things up, like configuring a lot of DMA channels).
If there is nothing to do, then that's good news. I will see later.

To be more precise about Gouraud lighting: Gouraud means a color (or a color altered by lighting) is calculated for each of the 3 vertices; then, per pixel, interpolation produces smooth color variations across the triangle. The formula involving the normal (the vector pointing out of the triangle's front face) and the light direction is not recalculated for each pixel in the test configuration I usually use to measure the maximum number of vertices per 60 Hz frame. (Here is another platform I could test, thanks to the work of tmbinc and tser published as 'xenkit' years ago: on a hacked Xbox 360, 3.1 million vertices per 60 Hz frame.)
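The description above boils down to two small functions: a per-vertex Lambert term (the normal-dot-light formula, computed once per vertex) and a per-pixel interpolation of the three vertex intensities. A minimal sketch, with barycentric weights standing in for the rasterizer's interpolation:

```c
/* Per-vertex Lambert term: clamp(N . L, 0, 1).  As described
 * above, this is computed once per vertex, never per pixel. */
static float lambert(const float n[3], const float l[3])
{
    float d = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
    if (d < 0.0f) return 0.0f;
    if (d > 1.0f) return 1.0f;
    return d;
}

/* Gouraud step: per pixel, blend the three per-vertex intensities
 * with barycentric weights (w0 + w1 + w2 = 1), giving the smooth
 * color variation across the triangle. */
static float gouraud(float i0, float i1, float i2,
                     float w0, float w1, float w2)
{
    return i0*w0 + i1*w1 + i2*w2;
}
```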

User avatar
Z80
Posts: 123
Joined: Tue Apr 17, 2012 8:19 am
Location: [CPU]

Re: 3D lib

Post by Z80 » Fri Aug 12, 2016 2:12 pm

If you are really interested in this topic you should search the internet for the leaked SDK documentation, especially the PDFs called GPU-Users_Guide_e.pdf and libgxm-Reference_e.pdf... BUT IT'S ILLEGAL, SO DON'T DO IT!!! :mrgreen:
