averne_ 2 days ago

Self-plug, but I wrote an open-source NVDEC driver for the Tegra X1, working on both the Switch OS and NVidia's Linux distro (L4T): https://github.com/averne/FFmpeg.

It currently integrates all the low-level bits into FFmpeg directly, though I am looking at moving those to a separate library. Eventually, I hope to support desktop cards as well with minimal code changes.
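
For anyone curious what this looks like from the client side, below is a minimal sketch of opening a hardware-backed decoder through FFmpeg's generic hwaccel API. The AV_HWDEVICE_TYPE_CUDA device type and AV_PIX_FMT_CUDA frame format are how mainline FFmpeg exposes NVDEC; they are assumptions here, since the Tegra/L4T fork may register its own device type or decoder names.

    /* Sketch: open a decoder with a hardware device attached, following the
     * pattern of FFmpeg's doc/examples/hw_decode.c. */
    #include <libavcodec/avcodec.h>
    #include <libavutil/hwcontext.h>

    static AVBufferRef *hw_device_ctx;

    static enum AVPixelFormat get_hw_format(AVCodecContext *ctx,
                                            const enum AVPixelFormat *fmts)
    {
        /* Prefer the hardware pixel format offered by the decoder. */
        for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
            if (*p == AV_PIX_FMT_CUDA)   /* assumption: CUDA frames, as in mainline NVDEC */
                return *p;
        return fmts[0];                  /* fall back to software decoding */
    }

    int open_hw_decoder(AVCodecContext **out, enum AVCodecID codec_id)
    {
        const AVCodec *codec = avcodec_find_decoder(codec_id);
        if (!codec)
            return AVERROR_DECODER_NOT_FOUND;

        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        if (!ctx)
            return AVERROR(ENOMEM);

        /* AV_HWDEVICE_TYPE_CUDA is what mainline uses for NVDEC;
         * the Tegra/L4T path may differ. */
        int ret = av_hwdevice_ctx_create(&hw_device_ctx, AV_HWDEVICE_TYPE_CUDA,
                                         NULL, NULL, 0);
        if (ret < 0)
            goto fail;

        ctx->hw_device_ctx = av_buffer_ref(hw_device_ctx);
        ctx->get_format    = get_hw_format;

        ret = avcodec_open2(ctx, codec, NULL);
        if (ret < 0)
            goto fail;

        *out = ctx;
        return 0;

    fail:
        avcodec_free_context(&ctx);
        return ret;
    }

From there, decoded frames come back as hardware surfaces and can be copied to system memory with av_hwframe_transfer_data() if CPU access is needed.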

mlsu 3 days ago

> Hardware accelerated video blocks often see plenty of reuse across a GPU lineup. I’m surprised that Nvidia spent extra engineering resources to make a completely different video engine for Tegra X1, rather than reusing the one in mainstream Maxwell GPUs. Perhaps Maxwell’s video engine was too power hungry to go into a mobile chip targeting 10W of power consumption.

It makes sense that the tested Tegra differs from a mainstream GTX 980. My understanding is that Tegra is basically a separate product line born from an early-2010s (somewhat) failed automotive/mobile play. I think those engineering resources were a sunk cost in need of recouping. Maybe someone on the inside would know better.

  • rickdeckard 2 days ago

    I'm not aware of any automotive play until after Tegra2 reached the end of its lifecycle as a smartphone CPU. From what I know, Tegra had its inception as a CPU for portable electronics, in the era of ~2011 when smartphone vendors sourced CPUs and modems from different parties instead of a single SoC that included the modem.

    Tegra2 went head-to-head with TI's OMAP3 in ~2012 for the fastest premium smartphone platform; Tegra2 powered the first dual-core smartphone in the world.

    At that time there were a few players on that market for CPU + Modem:

    - Qualcomm APQ + Qualcomm MDM Modem

    - TI OMAP + Infineon XMM Modem / Qualcomm MDM Modem

    - nVidia Tegra + Infineon XMM / Qualcomm MDM Modem

    - Samsung Exynos + Infineon XMM / Qualcomm MDM Modem

    - Apple/Samsung custom + Infineon XMM / Qualcomm MDM Modem

    In the end, Qualcomm won against TI and nVidia by forging a path to merge APQ and MDM into their SD/SDM SoC line, making it less economical for hardware manufacturers to pair Qualcomm modems with other vendors' CPUs.

metadat 3 days ago

> Desktop Maxwell’s video engine undershot bitrate targets while the Tegra X1’s got closer. For example, the GTX 980 Ti averaged 12.13 mbps with a 15 mbps target, while the Tegra X1 averaged 14.16 mbps. Intel’s QuickSync on Skylake CPUs can also do HEVC encoding, and massively undershot bitrates. With the same requested 15 mbps, QuickSync averaged just 10.2 mbps.

I wonder how Nvidia's RTX 30- and 40-series cards would perform at the HEVC encoding task.
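
If anyone wants to repeat the measurement on newer hardware, a sketch along these lines (using mainline FFmpeg's hevc_nvenc encoder; the article doesn't say which rate-control mode was used, so VBR here is an assumption) would request the same 15 Mbps target, and the achieved average bitrate could then be compared against it:

    /* Sketch: configure FFmpeg's hevc_nvenc encoder with a 15 Mbps target. */
    #include <libavcodec/avcodec.h>
    #include <libavutil/opt.h>

    AVCodecContext *open_hevc_nvenc(int width, int height, AVRational fps)
    {
        const AVCodec *codec = avcodec_find_encoder_by_name("hevc_nvenc");
        if (!codec)
            return NULL;

        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        if (!ctx)
            return NULL;

        ctx->width     = width;
        ctx->height    = height;
        ctx->time_base = av_inv_q(fps);
        ctx->framerate = fps;
        ctx->pix_fmt   = AV_PIX_FMT_YUV420P;

        ctx->bit_rate = 15000000;                    /* 15 Mbps target, as in the article */
        av_opt_set(ctx->priv_data, "rc", "vbr", 0);  /* assumption: VBR rate control */

        if (avcodec_open2(ctx, codec, NULL) < 0) {
            avcodec_free_context(&ctx);
            return NULL;
        }
        return ctx;
    }

The achieved bitrate is just total compressed bytes * 8 divided by the clip duration, which is presumably how the averages quoted above (12.13 Mbps against a 15 Mbps request, etc.) were computed.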

resource_waste 2 days ago

Cool, but the reality is that if a Switch game looks good, it's because the developers put in the effort to deal with the low-performance GPU.

Any close-up view of a Switch game looks a bit 'yikes'. I'm still a bit unforgiving of Gorons looking worse than in the N64 era.

  • smcl 2 days ago

    This is about video encoding, not 3D performance…

  • lomase 9 hours ago

    So you're telling us you never played, or even looked at, an N64 game.

  • 0xfaded 2 days ago

    I played Breath of the Wild during COVID and was just blown away by it, both as a work of art and as a piece of technical brilliance. Knowing the Switch's limitations, it felt like a AAA demoscene. I only learned later that the Wii was also a release platform!!!

    • jonhohle 2 days ago

      It was released on the Wii U. The Wii U arguably has a better GPU than the Switch due to different power constraints; however, the total CPU/GPU/RAM package is much better on the Switch. I also believe that BotW was developed for the Wii U, which flopped commercially so quickly that porting it to the Switch made sense.

      • mistyvales 2 days ago

        I played it on the Wii U, and outside of some random slowdown (which honestly happened in the Switch version too), I had a great time playing it and thought it still looked and played well for such an old system.

  • 3836293648 2 days ago

    Oh come on, Gorons do not look technically worse than in OoT/MM. You might not like the cel-shaded characters, but that is an art style, not a technical limitation, and pretending it is is just disingenuous.