app-benchmarks/stress-ng

Description:  A tool to load and stress a computer system
Homepage:     http://kernel.ubuntu.com/~cking/stress-ng/
Maintainers:  Conrad Kostecki <conikost@gentoo.org>

Version history
  • gentoo - app-benchmarks/stress-ng-0.05.12 [gentoo] - July 7, 2017, 5:04 a.m.
  • gentoo - app-benchmarks/stress-ng-0.06.00 [gentoo] - July 7, 2017, 5:03 a.m.
  • gentoo + app-benchmarks/stress-ng-0.06.00 [gentoo] - May 9, 2016, 5:03 a.m.
  • gentoo + app-benchmarks/stress-ng-0.05.12 [gentoo] - Feb. 5, 2016, 4:03 a.m.
euscan log

Date: Aug. 12, 2020, 6:01 a.m.

 * SRC_URI is 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-0.11.10.tar.xz'
 * Scanning: https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-${PV}.tar.xz
 * Scanning: https://kernel.ubuntu.com/~cking/tarballs/stress-ng
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng' blocked by robots.txt
 * Generating version from 0.11.10
 * Brute forcing: https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-${PV}.tar.xz
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-0.11.11.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-0.11.12.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-0.11.13.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-0.12.0.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-0.13.0.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-0.14.0.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-1.0.0.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-2.0.0.tar.xz' blocked by robots.txt
 * Url 'https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-3.0.0.tar.xz' blocked by robots.txt
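
The log above shows the shape of euscan's upstream check: take the ebuild's SRC_URI, substitute ${PV}, generate candidate version strings from the current 0.11.10, and probe each candidate URL while honoring the site's robots.txt (which here blocks every request). The sketch below illustrates one way such a probe could work; it is not euscan's actual code, and the candidate-generation rule, user-agent string, and HEAD-probe logic are assumptions made for illustration only.

#!/usr/bin/env python3
"""Minimal sketch of a euscan-style brute-force version probe (assumed logic)."""
from urllib.parse import urlsplit
from urllib.robotparser import RobotFileParser
import urllib.request

SRC_URI_TEMPLATE = "https://kernel.ubuntu.com/~cking/tarballs/stress-ng/stress-ng-{pv}.tar.xz"
CURRENT_PV = "0.11.10"
STEPS_PER_COMPONENT = 3  # assumed limit: try a few bumps per version component


def generate_candidates(pv: str, steps: int = STEPS_PER_COMPONENT):
    """Bump each version component a few times, zeroing the components after it.

    From 0.11.10 this yields 0.11.11..0.11.13, 0.12.0..0.14.0 and 1.0.0..3.0.0,
    matching the candidate URLs seen in the log above.
    """
    parts = [int(p) for p in pv.split(".")]
    for i in reversed(range(len(parts))):
        for step in range(1, steps + 1):
            bumped = parts[:i] + [parts[i] + step] + [0] * (len(parts) - i - 1)
            yield ".".join(str(p) for p in bumped)


def allowed_by_robots(url: str, user_agent: str = "euscan") -> bool:
    """Check the site's robots.txt before fetching a candidate URL."""
    scheme, netloc, *_ = urlsplit(url)
    rp = RobotFileParser()
    rp.set_url(f"{scheme}://{netloc}/robots.txt")
    try:
        rp.read()
    except OSError:
        return False  # treat an unreadable robots.txt as "blocked"
    return rp.can_fetch(user_agent, url)


def probe(url: str) -> bool:
    """Issue a HEAD request; a 200 response means the tarball exists upstream."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except OSError:
        return False


if __name__ == "__main__":
    for pv in generate_candidates(CURRENT_PV):
        url = SRC_URI_TEMPLATE.format(pv=pv)
        if not allowed_by_robots(url):
            print(f" * Url '{url}' blocked by robots.txt")
            continue
        if probe(url):
            print(f" * Found upstream version {pv}: {url}")

Because kernel.ubuntu.com's robots.txt disallows these paths, every candidate in the log is reported as blocked and no newer upstream version can be detected from this SRC_URI.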