Standard test needed to benchmark XRUNs

Optimize your system for ultimate performance.

Moderators: khz, MattKingUSA

windowsrefund
Established Member
Posts: 64
Joined: Mon Jul 30, 2018 11:04 pm

Standard test needed to benchmark XRUNs

Postby windowsrefund » Sat Dec 01, 2018 8:29 pm

Hello,

Still quite new to doing full-blown DAW via GNU/Linux and as we all know, there are plenty of knobs to tweak. I'd really like to come up with a standard test suite that attempts to generate XRUNs so I can track progress as I fine tune my system. I'm asking for opinions on what this test should be because I don't want to go through the tedious steps involved with launching a sequencer, opening a file, blah blah blah... I'm thinking along the lines of one of the phoronix test suites or something. Looking forward to hearing ideas.

Thanks in advance

lilith
Established Member
Posts: 990
Joined: Fri May 27, 2016 11:41 pm
Location: bLACK fOREST
Contact:

Re: Standard test needed to benchmark XRUNs

Postby lilith » Sat Dec 01, 2018 8:36 pm

I had the same thought five seconds ago while discussing some xrun issues on IRC, so if such a thing is technically possible it would be great.
Also, what does it mean when people mention latencies of 2 ms? Maybe they can record one instrument at 2 ms latency, but does that mean that a big session with lots of plugins also plays at 2 ms without any xruns?
https://soundcloud.com/lilith_93

_____________________________
Debian 9 (XFCE) & KXStudio repos

windowsrefund
Established Member
Posts: 64
Joined: Mon Jul 30, 2018 11:04 pm

Re: Standard test needed to benchmark XRUNs

Postby windowsrefund » Sat Dec 01, 2018 8:52 pm

Right, system performance and xruns are relative to each user's workload. That's why I'm looking for a standard load to run (test, tweak, repeat) as it relates to A/V. I have a feeling it'd come down to one or more of the tests provided by http://phoronix-test-suite.com.

khz
Established Member
Posts: 1135
Joined: Thu Apr 17, 2008 6:29 am
Location: German

Re: Standard test needed to benchmark XRUNs

Postby khz » Sat Dec 01, 2018 9:01 pm

lilith wrote:but does that mean that a big session with lots of plugins also plays at 2ms without any xruns?

All the software effects, instruments, and channels (and the mixing) must be calculated in real time. Ten or more software synthesizers/effects require more computing power than an electric bass through hardware effects with only software routing/recording.

# rt-tests >> https://packages.debian.org/de/stretch/rt-tests;
# cyclictest - High resolution test program >> https://manpages.debian.org/testing/rt-tests/cyclictest.8.en.html
# Using and Understanding the Real-Time Cyclictest Benchmark >> https://events.static.linuxfound.org/sites/events/files/slides/cyclictest.pdf
FZ - Does humor belongs in Music?
GNU/LINUX@AUDIO ~ /Wiki $ Howto.Info && GNU/Linux Debian installing >> Linux Audio Workstation LAW
    I don't care about the freedom of speech because I have nothing to say.

lilith
Established Member
Posts: 990
Joined: Fri May 27, 2016 11:41 pm
Location: bLACK fOREST
Contact:

Re: Standard test needed to benchmark XRUNs

Postby lilith » Sat Dec 01, 2018 9:13 pm

I have a test running with the RT kernel and discussed the results with JackWinter on IRC. There is apparently no problem: my maximum latency is 50 microseconds, even at 100% CPU usage.

Jack Winter
Established Member
Posts: 376
Joined: Sun May 28, 2017 3:52 pm

Re: Standard test needed to benchmark XRUNs

Postby Jack Winter » Sat Dec 01, 2018 9:43 pm

lilith wrote:I have a test running with the RT kernel and discussed the results with JackWinter at IRC. There is obviously no problem. My maximum latency is 50 microseconds even with 100% CPU usage.


Then the RT kernel is doing its job well.
Reaper/KDE/Archlinux. i7-2600k/16GB + i7-4700HQ/16GB, RME Multiface/Babyface, Behringer X32, WA273-EQ, 2 x WA-412, ADL-600, Tegeler TRC, etc 8) For REAPER on Linux information: https://wiki.cockos.com/wiki/index.php/REAPER_for_Linux

CrocoDuck
Established Member
Posts: 1052
Joined: Sat May 05, 2012 6:12 pm
Contact:

Re: Standard test needed to benchmark XRUNs

Postby CrocoDuck » Sun Dec 02, 2018 11:08 am

windowsrefund wrote:Hello,

Still quite new to doing full-blown DAW via GNU/Linux and as we all know, there are plenty of knobs to tweak. I'd really like to come up with a standard test suite that attempts to generate XRUNs so I can track progress as I fine tune my system. I'm asking for opinions on what this test should be because I don't want to go through the tedious steps involved with launching a sequencer, opening a file, blah blah blah... I'm thinking along the lines of one of the phoronix test suites or something. Looking forward to hearing ideas.

Thanks in advance


I started thinking about this too some time ago. We have utilities such as those mentioned by khz, but their results are hard to correlate with userland audio performance. My latest idea was to use FAUST and do something like this:

  • Have a FAUST-based DSP worker: a DSP algorithm with a known computational complexity. The FAUST compiler can be made to optimize the build for the target platform, which is perhaps appropriate for a fair test of a platform.
  • Have a program that sets up the audio stack (JACK, for example) with certain buffering variables (sample rate, buffer size, all that stuff).
  • The program then spawns DSP workers while logging all xruns.

This would give us the xrun frequency (xruns per unit time) as a function of:
  • DSP load.
  • JACK (or possibly ALSA?) settings.

Does it make any sense at all? The biggest con is that an exhaustive test (all combinations of JACK settings and reasonable DSP loads for those settings; I guess we can break out of the loop once we hit 100 xruns per second) could easily take a lot of time. But one could test only certain sample rates, I guess...
Check my Linux audio experiments on my SoundCloud.
Browse my AUR packages.
Fancying a swim in the pond?

khz
Established Member
Posts: 1135
Joined: Thu Apr 17, 2008 6:29 am
Location: German

Re: Standard test needed to benchmark XRUNs

Postby khz » Sun Dec 02, 2018 12:52 pm

In e.g. "qjackctl": Settings > Advanced > "Server Prefix", select "Soft Mode".
I also get xruns depending on the audio/MIDI load, but I don't hear them. Does it also depend on the sound card used?
Linux is talkative. No Panik!

merlyn
Established Member
Posts: 460
Joined: Thu Oct 11, 2018 4:13 pm

Re: Standard test needed to benchmark XRUNs

Postby merlyn » Tue Dec 11, 2018 1:53 pm

I think a standard test is a good idea. We would then need a standard unit of Audio-Based Computational Demand. I drew a graph of a simplified model.
DSPload.png


Relevant points that come out of the graph are: the slope of the graph is inversely proportional to the power of the system being tested, and the lines don't go through the origin, so outputting a stream of zeros takes some work.

To measure performance across different systems, the x-axis needs units: a standard unit of computational demand.

Ideally there would be no xruns below 100% DSP load. In mathematical terms, it would be good if the relationship between xruns and 100% DSP load were 'if and only if'. The significance is that 'if and only if' is a two-way relationship: xruns mean 100% DSP load, and 100% DSP load means xruns.

In practice I have found that not to be the case, but I suspect that xruns at low DSP loads tell us that something is wrong with the configuration.

khz
Established Member
Posts: 1135
Joined: Thu Apr 17, 2008 6:29 am
Location: German

Re: Standard test needed to benchmark XRUNs

Postby khz » Tue Dec 11, 2018 2:08 pm

Besides rt-tests https://git.kernel.org/pub/scm/utils/rt-tests/ for audio there is also test for MIDI: alsa-midi-latency-test https://github.com/koppi/alsa-midi-latency-test and jack_midi_latency https://github.com/x42/jack_midi_latency.

merlyn
Established Member
Posts: 460
Joined: Thu Oct 11, 2018 4:13 pm

Re: Standard test needed to benchmark XRUNs

Postby merlyn » Tue Dec 11, 2018 6:42 pm

khz wrote:Besides rt-tests https://git.kernel.org/pub/scm/utils/rt-tests/ for audio there is also test for MIDI: alsa-midi-latency-test https://github.com/koppi/alsa-midi-latency-test and jack_midi_latency https://github.com/x42/jack_midi_latency.


Thanks for the links. The proposed test has a different focus: the currently available tests make sure the foundations are solid to build an audio system on.

The proposed test would find out how much force is required to push the building over, given solid foundations.

Jack Winter
Established Member
Posts: 376
Joined: Sun May 28, 2017 3:52 pm

Re: Standard test needed to benchmark XRUNs

Postby Jack Winter » Tue Dec 11, 2018 7:22 pm

IMO, the big question is what an xrun is, and what DSP load is.. :)

The basic issue is that calculating the audio has a deadline; if it's not finished on time (minus overhead) there is a dropout/glitch/xrun. The deadline can be calculated as buffer size / sample rate.

The DSP load is an average, reported by the JACK server, indicating how much of that deadline was actually used.

Xruns can happen even at low DSP load, if there is either a hardware issue causing the hardware to respond too late, or a software issue causing the program to be late in delivering its result.

IMO, the first step ought to be to make it obvious to the user whether the xrun is caused by hardware or software... :)

It would also be very useful to have a max latency metric from the jack server, and not only an average..

JACK2 can be built with the --profile option, which can give a lot of interesting metrics; you can even get the information as audio signals to record into your DAW for later perusal, see: http://www.grame.fr/ressources/publications/Timing.pdf

windowsrefund
Established Member
Posts: 64
Joined: Mon Jul 30, 2018 11:04 pm

Re: Standard test needed to benchmark XRUNs

Postby windowsrefund » Thu Dec 13, 2018 9:08 pm

khz wrote:Besides rt-tests https://git.kernel.org/pub/scm/utils/rt-tests/ for audio there is also test for MIDI: alsa-midi-latency-test https://github.com/koppi/alsa-midi-latency-test and jack_midi_latency https://github.com/x42/jack_midi_latency.


I found these a few days ago but couldn't get any value from them, as each requires a loopback created with MIDI cables. In my case that's a problem, since I'm using one of those USB-to-MIDI cables with a single MIDI IN and MIDI OUT; I don't have the ability to connect the MIDI IN to a MIDI OUT. Years ago this would have been possible, when I used a dedicated MIDI card with individual IN and OUT ports.

khz
Established Member
Posts: 1135
Joined: Thu Apr 17, 2008 6:29 am
Location: German

Re: Standard test needed to benchmark XRUNs

Postby khz » Sat Dec 15, 2018 10:41 am

Yes, these tests are for "old" real MIDI interfaces.
These "old" real MIDI interfaces, built into a sound card / audio interface, send the data byte by byte.
USB MIDI sends the data block by block, due to how USB works; USB transmits the MIDI data (reliably?) across many layers.
From a technical point of view this is a serious difference.

####

Possibly helpful to the topic :
JACK Latency tests https://wiki.linuxaudio.org/wiki/jack_latency_tests
Latency: Myths and Facts. Part 3(2/1): A look at a quantitative study https://thecrocoduckspond.wordpress.com/2017/07/23/latency-myths-and-facts-part-3-a-look-at-a-quantitative-study/

merlyn
Established Member
Posts: 460
Joined: Thu Oct 11, 2018 4:13 pm

Re: Standard test needed to benchmark XRUNs

Postby merlyn » Thu Dec 20, 2018 12:42 am

I found an xrun counter that tramp wrote in this thread:

xruncounter.c

Code:

#include <stdio.h>
#include <errno.h>
#include <unistd.h>
#include <signal.h>
#include <stdlib.h>

#include <jack/jack.h>

/*   gcc -Wall xruncounter.c -lm `pkg-config --cflags --libs jack` -o xruncounter */

jack_client_t *client;


void
jack_shutdown (void *arg)
{
   exit (1);
}

int jack_xrun_callback(void *arg)
{
   /* count xruns */
   static int xruns = 0;
   xruns += 1;
   fprintf (stderr, "xrun %i \n", xruns);
   return 0;
}

void
signal_handler (int sig)
{
   jack_client_close (client);
   fprintf (stderr, " signal received, exiting ...\n");
   exit (0);
}

int
main (int argc, char *argv[])

{

   if ((client = jack_client_open ("xruncounter", JackNullOption, NULL)) == 0) {
      fprintf (stderr, "jack server not running?\n");
      return 1;
   }

   signal (SIGQUIT, signal_handler);
   signal (SIGTERM, signal_handler);
   signal (SIGHUP, signal_handler);
   signal (SIGINT, signal_handler);

   jack_set_xrun_callback(client, jack_xrun_callback, 0);
   jack_on_shutdown (client, jack_shutdown, 0);

   if (jack_activate (client)) {
      fprintf (stderr, "cannot activate client");
      return 1;
   }
   while (1) {
      usleep (100000);
   }

   jack_client_close (client);
   exit (0);
}

So that's a start on one half of the test. The other half is to generate load.
If we compare the proposed 'xrun benchmark' to the rt-tests combination of hackbench and cyclictest, they are similar but not identical. In rt-tests, hackbench (the load generator) is simple while cyclictest (the measurement part) is complex and flexible. In the xrun benchmark it would be the load-generating part that has more options: buffer size, sample rate, and other things yet to be specified. The load generator can be thought of as equivalent to 'number of soft synths and plugins'.


Return to “System Tuning and Configuration”
