mediawiki/extensions/Wikispeech: main (log #957009)

sourcepatches

This run took 43 seconds.

$ date
--- stdout ---
Fri Mar 17 06:13:36 UTC 2023

--- end ---
$ git clone file:///srv/git/mediawiki-extensions-Wikispeech.git repo --depth=1 -b master
--- stderr ---
Cloning into 'repo'...
--- stdout ---

--- end ---
$ git config user.name libraryupgrader
--- stdout ---

--- end ---
$ git config user.email tools.libraryupgrader@tools.wmflabs.org
--- stdout ---

--- end ---
$ git submodule update --init
--- stdout ---

--- end ---
$ grr init
--- stdout ---
Installed commit-msg hook.

--- end ---
$ git show-ref refs/heads/master
--- stdout ---
a1894449ee4e82ec634c532ec56e6c7022cb9cbb refs/heads/master

--- end ---
$ /usr/bin/npm audit --json --legacy-peer-deps
--- stdout ---
{
  "auditReportVersion": 2,
  "vulnerabilities": {},
  "metadata": {
    "vulnerabilities": {
      "info": 0,
      "low": 0,
      "moderate": 0,
      "high": 0,
      "critical": 0,
      "total": 0
    },
    "dependencies": {
      "prod": 1,
      "dev": 410,
      "optional": 0,
      "peer": 0,
      "peerOptional": 0,
      "total": 410
    }
  }
}

--- end ---
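The audit report above is machine readable. As a sketch only (this is not part of the libraryupgrader tooling, and the file name audit.json is an assumption), a follow-up script could gate on the vulnerability counts like this:

const fs = require( 'fs' );

// Hypothetical: assumes the `npm audit --json` output was saved to audit.json.
const report = JSON.parse( fs.readFileSync( 'audit.json', 'utf8' ) );
const counts = report.metadata.vulnerabilities;

if ( counts.total > 0 ) {
	console.error( 'npm audit reported ' + counts.total + ' vulnerabilities (' +
		counts.high + ' high, ' + counts.critical + ' critical)' );
	process.exit( 1 );
}
console.log( 'No known vulnerabilities in', report.metadata.dependencies.total, 'packages' );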
$ /usr/bin/composer install
--- stderr ---
No lock file found. Updating dependencies instead of installing from lock file. Use composer update over composer install if you do not have a lock file.
Loading composer repositories with package information
Info from https://repo.packagist.org: #StandWithUkraine
Updating dependencies
Lock file operations: 34 installs, 0 updates, 0 removals
  - Locking composer/pcre (3.1.0)
  - Locking composer/semver (3.3.2)
  - Locking composer/spdx-licenses (1.5.7)
  - Locking composer/xdebug-handler (3.0.3)
  - Locking felixfbecker/advanced-json-rpc (v3.2.1)
  - Locking mediawiki/mediawiki-codesniffer (v41.0.0)
  - Locking mediawiki/mediawiki-phan-config (0.12.0)
  - Locking mediawiki/minus-x (1.1.1)
  - Locking mediawiki/phan-taint-check-plugin (4.0.0)
  - Locking microsoft/tolerant-php-parser (v0.1.1)
  - Locking netresearch/jsonmapper (v4.1.0)
  - Locking phan/phan (5.4.1)
  - Locking php-parallel-lint/php-console-color (v1.0.1)
  - Locking php-parallel-lint/php-console-highlighter (v1.0.0)
  - Locking php-parallel-lint/php-parallel-lint (v1.3.2)
  - Locking phpdocumentor/reflection-common (2.2.0)
  - Locking phpdocumentor/reflection-docblock (5.3.0)
  - Locking phpdocumentor/type-resolver (1.6.2)
  - Locking psr/container (1.1.2)
  - Locking psr/log (1.1.4)
  - Locking sabre/event (5.1.4)
  - Locking squizlabs/php_codesniffer (3.7.2)
  - Locking symfony/console (v5.4.21)
  - Locking symfony/deprecation-contracts (v2.5.2)
  - Locking symfony/polyfill-ctype (v1.27.0)
  - Locking symfony/polyfill-intl-grapheme (v1.27.0)
  - Locking symfony/polyfill-intl-normalizer (v1.27.0)
  - Locking symfony/polyfill-mbstring (v1.27.0)
  - Locking symfony/polyfill-php73 (v1.27.0)
  - Locking symfony/polyfill-php80 (v1.27.0)
  - Locking symfony/service-contracts (v2.5.2)
  - Locking symfony/string (v5.4.21)
  - Locking tysonandre/var_representation_polyfill (0.1.3)
  - Locking webmozart/assert (1.11.0)
Writing lock file
Installing dependencies from lock file (including require-dev)
Package operations: 34 installs, 0 updates, 0 removals
  - Installing composer/pcre (3.1.0): Extracting archive
  - Installing symfony/polyfill-php80 (v1.27.0): Extracting archive
  - Installing squizlabs/php_codesniffer (3.7.2): Extracting archive
  - Installing symfony/polyfill-mbstring (v1.27.0): Extracting archive
  - Installing composer/spdx-licenses (1.5.7): Extracting archive
  - Installing composer/semver (3.3.2): Extracting archive
  - Installing mediawiki/mediawiki-codesniffer (v41.0.0): Extracting archive
  - Installing tysonandre/var_representation_polyfill (0.1.3): Extracting archive
  - Installing symfony/polyfill-intl-normalizer (v1.27.0): Extracting archive
  - Installing symfony/polyfill-intl-grapheme (v1.27.0): Extracting archive
  - Installing symfony/polyfill-ctype (v1.27.0): Extracting archive
  - Installing symfony/string (v5.4.21): Extracting archive
  - Installing symfony/deprecation-contracts (v2.5.2): Extracting archive
  - Installing psr/container (1.1.2): Extracting archive
  - Installing symfony/service-contracts (v2.5.2): Extracting archive
  - Installing symfony/polyfill-php73 (v1.27.0): Extracting archive
  - Installing symfony/console (v5.4.21): Extracting archive
  - Installing sabre/event (5.1.4): Extracting archive
  - Installing netresearch/jsonmapper (v4.1.0): Extracting archive
  - Installing microsoft/tolerant-php-parser (v0.1.1): Extracting archive
  - Installing webmozart/assert (1.11.0): Extracting archive
  - Installing phpdocumentor/reflection-common (2.2.0): Extracting archive
  - Installing phpdocumentor/type-resolver (1.6.2): Extracting archive
  - Installing phpdocumentor/reflection-docblock (5.3.0): Extracting archive
  - Installing felixfbecker/advanced-json-rpc (v3.2.1): Extracting archive
  - Installing psr/log (1.1.4): Extracting archive
  - Installing composer/xdebug-handler (3.0.3): Extracting archive
  - Installing phan/phan (5.4.1): Extracting archive
  - Installing mediawiki/phan-taint-check-plugin (4.0.0): Extracting archive
  - Installing mediawiki/mediawiki-phan-config (0.12.0): Extracting archive
  - Installing mediawiki/minus-x (1.1.1): Extracting archive
  - Installing php-parallel-lint/php-console-color (v1.0.1): Extracting archive
  - Installing php-parallel-lint/php-console-highlighter (v1.0.0): Extracting archive
  - Installing php-parallel-lint/php-parallel-lint (v1.3.2): Extracting archive
4 package suggestions were added by new dependencies, use `composer suggest` to see details.
Generating autoload files
14 packages you are using are looking for funding.
Use the `composer fund` command to find out more!
--- stdout ---

--- end ---
Upgrading n:eslint-config-wikimedia from 0.20.0 -> 0.24.0
Upgrading n:grunt from 1.5.3 -> 1.6.1
Upgrading n:stylelint-config-wikimedia from 0.13.1 -> 0.14.0
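The three "Upgrading n:..." lines above correspond to bumping pinned devDependencies in package.json. A minimal illustration of the equivalent edit (hypothetical helper script, not how libraryupgrader itself applies the change):

const fs = require( 'fs' );

// Versions taken from the upgrade lines above.
const upgrades = {
	'eslint-config-wikimedia': '0.24.0',
	grunt: '1.6.1',
	'stylelint-config-wikimedia': '0.14.0'
};

const pkg = JSON.parse( fs.readFileSync( 'package.json', 'utf8' ) );
for ( const [ name, version ] of Object.entries( upgrades ) ) {
	if ( pkg.devDependencies && pkg.devDependencies[ name ] ) {
		pkg.devDependencies[ name ] = version;
	}
}
fs.writeFileSync( 'package.json', JSON.stringify( pkg, null, '\t' ) + '\n' );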
$ /usr/bin/npm install
--- stdout ---

added 395 packages, and audited 396 packages in 5s

68 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities

--- end ---
$ package-lock-lint package-lock.json
--- stdout ---
Checking package-lock.json

--- end ---
$ /usr/bin/npm install grunt-eslint@24.0.0 --save-exact
--- stdout ---

up to date, audited 396 packages in 857ms

68 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities

--- end ---
$ package-lock-lint package-lock.json
--- stdout ---
Checking package-lock.json

--- end ---
$ ./node_modules/.bin/eslint i18n/rmc.json package.json i18n/ary.json i18n/om.json i18n/bpy.json i18n/nb.json i18n/sr-el.json i18n/myv.json i18n/pt-br.json i18n/api/nb.json i18n/api/fa.json jsduck.json i18n/kaa.json i18n/api/se.json i18n/api/pl.json i18n/sr-ec.json i18n/api/de.json i18n/fa.json tests/qunit/ext.wikispeech.ui.test.js i18n/skr-arab.json i18n/api/zh-hant.json i18n/api/uk.json i18n/de.json i18n/api/tr.json i18n/br.json modules/ext.wikispeech.userOptionsDialog.js extension.json i18n/hu.json i18n/ug-arab.json i18n/ko.json i18n/ar.json i18n/kn.json i18n/lb.json i18n/es.json i18n/zh-hans.json i18n/api/en.json package-lock.json i18n/api/mk.json i18n/api/io.json i18n/api/es.json i18n/sco.json i18n/bn.json i18n/smn.json i18n/pt.json i18n/api/eu.json i18n/av.json tests/qunit/ext.wikispeech.selectionPlayer.test.js i18n/api/sms.json i18n/ms.json i18n/ca.json i18n/api/br.json modules/ext.wikispeech.util.js i18n/api/pt.json i18n/zh-hant.json i18n/ja.json i18n/zgh.json i18n/pl.json i18n/scn.json i18n/api/hi.json tests/qunit/ext.wikispeech.test.util.js i18n/mrh.json modules/ext.wikispeech.specialEditLexicon.js Gruntfile.js i18n/sms.json i18n/hy.json modules/ext.wikispeech.storage.js i18n/rki.json i18n/yue.json i18n/api/qqq.json i18n/api/ko.json i18n/hr.json i18n/io.json i18n/api/zh-hans.json i18n/api/ig.json i18n/eu.json i18n/api/hr.json modules/ext.wikispeech.highlighter.js tests/qunit/ext.wikispeech.transcriptionPreviewer.test.js i18n/ur.json i18n/vec.json i18n/ru.json i18n/he.json i18n/api/pt-br.json i18n/my.json modules/ext.wikispeech.gadget.js i18n/da.json modules/ext.wikispeech.main.js i18n/api/sl.json i18n/api/vec.json i18n/api/fr.json i18n/ce.json i18n/qqq.json tests/qunit/ext.wikispeech.highlighter.test.js composer.json i18n/api/ar.json i18n/sh.json i18n/tr.json i18n/api/skr-arab.json sql/tables.json i18n/api/sv.json i18n/diq.json i18n/api/nl.json i18n/fr.json i18n/roa-tara.json i18n/pnb.json i18n/it.json i18n/te.json modules/ext.wikispeech.ui.js tests/qunit/ext.wikispeech.player.test.js i18n/mzn.json tests/qunit/index.js i18n/se.json i18n/mnw.json i18n/api/sh.json i18n/dag.json modules/ext.wikispeech.player.js modules/ext.wikispeech.transcriptionPreviewer.js i18n/hi.json modules/ext.wikispeech.selectionPlayer.js i18n/api/scn.json i18n/sl.json i18n/sv.json tests/qunit/ext.wikispeech.storage.test.js docs/gadget-template.js i18n/uk.json i18n/xmf.json i18n/tly.json sql/abstractSchemaChanges/patch-wikispeech_utterance-wsu_date_stored.json i18n/ku-latn.json i18n/sq.json i18n/nl.json i18n/ia.json i18n/api/sma.json i18n/arc.json modules/ext.wikispeech.loader.js i18n/api/lb.json i18n/sd.json i18n/api/he.json i18n/cs.json i18n/mk.json i18n/api/smn.json i18n/en.json i18n/fi.json i18n/api/ia.json i18n/eo.json --fix
--- stdout ---

/src/repo/docs/gadget-template.js
  8:5   warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  8:18  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  8:36  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals

/src/repo/modules/ext.wikispeech.highlighter.js
  250:5  error  ES2015 'String.prototype.normalize' method is forbidden  es-x/no-string-prototype-normalize
  252:5  error  ES2015 'String.prototype.normalize' method is forbidden  es-x/no-string-prototype-normalize

/src/repo/modules/ext.wikispeech.specialEditLexicon.js
  1:5   warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:16  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:26  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:42  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:53  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:58  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  2:2   warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals

/src/repo/modules/ext.wikispeech.storage.js
   998:13  warning  document.evaluate() is not supported in IE 11  compat/compat
  1002:5   warning  XPathResult is not supported in IE 11          compat/compat

/src/repo/modules/ext.wikispeech.transcriptionPreviewer.js
   1:5  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable     no-implicit-globals
  12:1  warning  Unexpected function declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals

/src/repo/tests/qunit/ext.wikispeech.player.test.js
  172:3  error  Unexpected assert.ok. Use assert.strictEqual, assert.notStrictEqual, assert.deepEqual, or assert.propEqual  qunit/no-loose-assertions
  173:3  error  Unexpected assert.ok. Use assert.strictEqual, assert.notStrictEqual, assert.deepEqual, or assert.propEqual  qunit/no-loose-assertions

✖ 18 problems (4 errors, 14 warnings)


--- end ---
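For context, the rules flagged above ask for changes along these lines. This is an illustrative sketch with made-up variable names, assuming the usual MediaWiki browser/QUnit environment (mw, QUnit, document); it is not the diff this run produced:

// qunit/no-loose-assertions: replace loose assert.ok with a strict assertion.
QUnit.test( 'example', function ( assert ) {
	var playing = true;
	// Instead of: assert.ok( playing );
	assert.strictEqual( playing, true );
} );

// no-implicit-globals: keep top-level `var` declarations out of the global
// scope by wrapping the code in an IIFE.
( function () {
	var producerUrl = 'https://.../w';
	mw.wikispeech = mw.wikispeech || {};
	mw.wikispeech.producerUrl = producerUrl;
}() );

// es-x/no-string-prototype-normalize: per the source comment in the report,
// the flagged calls are DOM Node#normalize() (merging adjacent text nodes),
// not String#normalize(), so one option is a targeted disable comment.
( function () {
	var span = document.createElement( 'span' );
	span.append( 'a', 'b' );
	// eslint-disable-next-line es-x/no-string-prototype-normalize
	span.normalize();
}() );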
$ ./node_modules/.bin/eslint i18n/rmc.json package.json i18n/ary.json i18n/om.json i18n/bpy.json i18n/nb.json i18n/sr-el.json i18n/myv.json i18n/pt-br.json i18n/api/nb.json i18n/api/fa.json jsduck.json i18n/kaa.json i18n/api/se.json i18n/api/pl.json i18n/sr-ec.json i18n/api/de.json i18n/fa.json tests/qunit/ext.wikispeech.ui.test.js i18n/skr-arab.json i18n/api/zh-hant.json i18n/api/uk.json i18n/de.json i18n/api/tr.json i18n/br.json modules/ext.wikispeech.userOptionsDialog.js extension.json i18n/hu.json i18n/ug-arab.json i18n/ko.json i18n/ar.json i18n/kn.json i18n/lb.json i18n/es.json i18n/zh-hans.json i18n/api/en.json package-lock.json i18n/api/mk.json i18n/api/io.json i18n/api/es.json i18n/sco.json i18n/bn.json i18n/smn.json i18n/pt.json i18n/api/eu.json i18n/av.json tests/qunit/ext.wikispeech.selectionPlayer.test.js i18n/api/sms.json i18n/ms.json i18n/ca.json i18n/api/br.json modules/ext.wikispeech.util.js i18n/api/pt.json i18n/zh-hant.json i18n/ja.json i18n/zgh.json i18n/pl.json i18n/scn.json i18n/api/hi.json tests/qunit/ext.wikispeech.test.util.js i18n/mrh.json modules/ext.wikispeech.specialEditLexicon.js Gruntfile.js i18n/sms.json i18n/hy.json modules/ext.wikispeech.storage.js i18n/rki.json i18n/yue.json i18n/api/qqq.json i18n/api/ko.json i18n/hr.json i18n/io.json i18n/api/zh-hans.json i18n/api/ig.json i18n/eu.json i18n/api/hr.json modules/ext.wikispeech.highlighter.js tests/qunit/ext.wikispeech.transcriptionPreviewer.test.js i18n/ur.json i18n/vec.json i18n/ru.json i18n/he.json i18n/api/pt-br.json i18n/my.json modules/ext.wikispeech.gadget.js i18n/da.json modules/ext.wikispeech.main.js i18n/api/sl.json i18n/api/vec.json i18n/api/fr.json i18n/ce.json i18n/qqq.json tests/qunit/ext.wikispeech.highlighter.test.js composer.json i18n/api/ar.json i18n/sh.json i18n/tr.json i18n/api/skr-arab.json sql/tables.json i18n/api/sv.json i18n/diq.json i18n/api/nl.json i18n/fr.json i18n/roa-tara.json i18n/pnb.json i18n/it.json i18n/te.json modules/ext.wikispeech.ui.js tests/qunit/ext.wikispeech.player.test.js i18n/mzn.json tests/qunit/index.js i18n/se.json i18n/mnw.json i18n/api/sh.json i18n/dag.json modules/ext.wikispeech.player.js modules/ext.wikispeech.transcriptionPreviewer.js i18n/hi.json modules/ext.wikispeech.selectionPlayer.js i18n/api/scn.json i18n/sl.json i18n/sv.json tests/qunit/ext.wikispeech.storage.test.js docs/gadget-template.js i18n/uk.json i18n/xmf.json i18n/tly.json sql/abstractSchemaChanges/patch-wikispeech_utterance-wsu_date_stored.json i18n/ku-latn.json i18n/sq.json i18n/nl.json i18n/ia.json i18n/api/sma.json i18n/arc.json modules/ext.wikispeech.loader.js i18n/api/lb.json i18n/sd.json i18n/api/he.json i18n/cs.json i18n/mk.json i18n/api/smn.json i18n/en.json i18n/fi.json i18n/api/ia.json i18n/eo.json -f json
--- stdout ---
[{"filePath":"/src/repo/Gruntfile.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/composer.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/docs/gadget-template.js","messages":[{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":8,"column":5,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":8,"endColumn":16},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":8,"column":18,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":8,"endColumn":34},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":8,"column":36,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":8,"endColumn":45}],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":3,"fixableErrorCount":0,"fixableWarningCount":0,"source":"// Copy the code below to a gadget or to common.js / global.js. Set\n// `producerUrl` to the script path of the wiki that runs\n// Wikispeech (producer). You can get the correct URL by running the following\n// Javascript snippet in the developer console on the producer wiki:\n//\n// window.location.origin + mw.config.get( 'wgScriptPath' );\n\nvar producerUrl, parametersString, moduleUrl;\n\n// Set this to the script path on the producer wiki. Usually ends with\n// \"/w\".\nproducerUrl = 'https://.../w';\n\nmw.wikispeech = mw.wikispeech || {};\nmw.wikispeech.producerUrl = producerUrl;\nparametersString = $.param( {\n\tlang: mw.config.get( 'wgUserLanguage' ),\n\tskin: mw.config.get( 'skin' ),\n\traw: 1,\n\tsafemode: 1,\n\tmodules: 'ext.wikispeech.gadget'\n} );\nmoduleUrl = producerUrl + '/load.php?' 
+ parametersString;\nmw.loader.load( moduleUrl );\n","usedDeprecatedRules":[]},{"filePath":"/src/repo/extension.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/ar.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/br.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/de.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/en.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/es.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/eu.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/fa.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/fr.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/he.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/hi.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/hr.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/ia.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/ig.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/io.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/ko.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/lb.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n
/api/mk.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/nb.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/nl.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/pl.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/pt-br.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/pt.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/qqq.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/scn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/se.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/sh.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/skr-arab.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/sl.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/sma.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/smn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/sms.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/sv.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/tr.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/uk.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"wa
rningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/vec.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/zh-hans.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/api/zh-hant.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ar.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/arc.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ary.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/av.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/bn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/bpy.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/br.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ca.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ce.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/cs.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/da.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/dag.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/de.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/diq.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/en.json","messages":[],"suppre
ssedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/eo.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/es.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/eu.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/fa.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/fi.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/fr.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/he.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/hi.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/hr.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/hu.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/hy.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ia.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/io.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/it.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ja.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/kaa.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/kn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ko
.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ku-latn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/lb.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/mk.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/mnw.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/mrh.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ms.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/my.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/myv.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/mzn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/nb.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/nl.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/om.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/pl.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/pnb.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/pt-br.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/pt.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/qqq.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecated
Rules":[]},{"filePath":"/src/repo/i18n/rki.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/rmc.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/roa-tara.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ru.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/scn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sco.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sd.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/se.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sh.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/skr-arab.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sl.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/smn.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sms.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sq.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sr-ec.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sr-el.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/sv.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/te.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableEr
rorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/tly.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/tr.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ug-arab.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/uk.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/ur.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/vec.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/xmf.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/yue.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/zgh.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/zh-hans.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/i18n/zh-hant.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/jsduck.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.gadget.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.highlighter.js","messages":[{"ruleId":"es-x/no-string-prototype-normalize","severity":2,"message":"ES2015 'String.prototype.normalize' method is forbidden.","line":250,"column":5,"nodeType":"MemberExpression","messageId":"forbidden","endLine":250,"endColumn":27},{"ruleId":"es-x/no-string-prototype-normalize","severity":2,"message":"ES2015 'String.prototype.normalize' method is forbidden.","line":252,"column":5,"nodeType":"MemberExpression","messageId":"forbidden","endLine":252,"endColumn":44}],"suppressedMessages":[{"ruleId":"mediawiki/class-doc","severity":2,"message":"All possible CSS classes should be documented. 
See https://w.wiki/PS2 for details.","line":39,"column":11,"nodeType":"CallExpression","endLine":40,"endColumn":49,"suppressions":[{"kind":"directive","justification":""}]}],"errorCount":2,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"( function () {\n\n\t/**\n\t * Handles highlighting parts of the page when reciting.\n\t *\n\t * @class ext.wikispeech.Highlighter\n\t * @constructor\n\t */\n\n\tfunction Highlighter() {\n\t\tvar self = this;\n\t\tself.highlightTokenTimer = null;\n\t\tself.utteranceHighlightingClass =\n\t\t\t'ext-wikispeech-highlight-sentence';\n\t\tself.utteranceHighlightingSelector =\n\t\t\t'.' + self.utteranceHighlightingClass;\n\n\t\t/**\n\t\t * Highlight text associated with an utterance.\n\t\t *\n\t\t * Adds highlight spans to the text nodes from which the\n\t\t * tokens of the utterance were created. For first and last node,\n\t\t * it's possible that only part of the text is highlighted,\n\t\t * since they may contain start/end of next/previous\n\t\t * utterance.\n\t\t *\n\t\t * @param {Object} utterance The utterance to add\n\t\t *  highlighting to.\n\t\t */\n\n\t\tthis.highlightUtterance = function ( utterance ) {\n\t\t\tvar textNodes, span;\n\n\t\t\ttextNodes = utterance.content.map( function ( item ) {\n\t\t\t\treturn mw.wikispeech.storage.getNodeForItem( item );\n\t\t\t} );\n\t\t\t// Class name is documented above\n\t\t\t// eslint-disable-next-line mediawiki/class-doc\n\t\t\tspan = $( '<span>' )\n\t\t\t\t.addClass( self.utteranceHighlightingClass )\n\t\t\t\t.get( 0 );\n\t\t\tself.wrapTextNodes(\n\t\t\t\tspan,\n\t\t\t\ttextNodes,\n\t\t\t\tutterance.startOffset,\n\t\t\t\tutterance.endOffset\n\t\t\t);\n\t\t\t$( self.utteranceHighlightingSelector ).each( function ( i ) {\n\t\t\t\t// Save the path to the text node, as it was before\n\t\t\t\t// adding the span. This will no longer be the correct\n\t\t\t\t// path, once the span is added. This enables adding\n\t\t\t\t// token highlighting within the utterance\n\t\t\t\t// highlighting.\n\t\t\t\tthis.textPath = utterance.content[ i ].path;\n\t\t\t} );\n\t\t};\n\n\t\t/**\n\t\t * Wrap text nodes in an element.\n\t\t *\n\t\t * Each text node is wrapped in an individual copy of the\n\t\t * wrapper element. The first and last node will be partially\n\t\t * wrapped, based on the offset values.\n\t\t *\n\t\t * @param {HTMLElement} wrapper The element used to wrap the\n\t\t *  text nodes.\n\t\t * @param {Text[]} textNodes The text nodes to wrap.\n\t\t * @param {number} startOffset The start offset in the first\n\t\t *  text node.\n\t\t * @param {number} endOffset The end offset in the last text\n\t\t *  node.\n\t\t */\n\n\t\tthis.wrapTextNodes = function (\n\t\t\twrapper,\n\t\t\ttextNodes,\n\t\t\tstartOffset,\n\t\t\tendOffset\n\t\t) {\n\t\t\tvar $nodesToWrap, firstNode, i, lastNode, node;\n\n\t\t\t$nodesToWrap = $();\n\t\t\tfirstNode = textNodes[ 0 ];\n\t\t\tif ( textNodes.length === 1 ) {\n\t\t\t\t// If there is only one node that should be wrapped,\n\t\t\t\t// split it twice; once for the start and once for the\n\t\t\t\t// end offset.\n\t\t\t\tfirstNode.splitText( startOffset );\n\t\t\t\tfirstNode.nextSibling.splitText( endOffset + 1 - startOffset );\n\t\t\t\t$nodesToWrap = $nodesToWrap.add( firstNode.nextSibling );\n\t\t\t} else {\n\t\t\t\tfirstNode.splitText( startOffset );\n\t\t\t\t// The first half of a split node remains as the\n\t\t\t\t// original node. 
Since we want the second half, we add\n\t\t\t\t// the following node.\n\t\t\t\t$nodesToWrap = $nodesToWrap.add( firstNode.nextSibling );\n\t\t\t\tfor ( i = 1; i < textNodes.length - 1; i++ ) {\n\t\t\t\t\tnode = textNodes[ i ];\n\t\t\t\t\t// Wrap all the nodes between first and last\n\t\t\t\t\t// completely.\n\t\t\t\t\t$nodesToWrap = $nodesToWrap.add( node );\n\t\t\t\t}\n\t\t\t\tlastNode = textNodes[ textNodes.length - 1 ];\n\t\t\t\tlastNode.splitText( endOffset + 1 );\n\t\t\t\t$nodesToWrap = $nodesToWrap.add( lastNode );\n\t\t\t}\n\t\t\t$nodesToWrap.wrap( wrapper );\n\t\t};\n\n\t\t/**\n\t\t * Highlight a token in the original HTML.\n\t\t *\n\t\t * What part of the HTML to wrap is calculated from a token.\n\t\t *\n\t\t * @param {Object} token The token used to calculate what part\n\t\t *  to highlight.\n\t\t */\n\n\t\tthis.startTokenHighlighting = function ( token ) {\n\t\t\tself.removeWrappers( '.ext-wikispeech-highlight-word' );\n\t\t\tself.clearHighlightTokenTimer();\n\t\t\tself.highlightToken( token );\n\t\t\tself.setHighlightTokenTimer( token );\n\t\t};\n\n\t\t/**\n\t\t * Highlight a token in the original HTML.\n\t\t *\n\t\t * What part of the HTML to wrap is calculated from a token.\n\t\t *\n\t\t * @param {Object} token The token used to calculate what part\n\t\t *  to highlight.\n\t\t */\n\n\t\tthis.highlightToken = function ( token ) {\n\t\t\tvar span, textNodes, startOffset, endOffset;\n\n\t\t\tspan = $( '<span>' )\n\t\t\t\t.addClass( 'ext-wikispeech-highlight-word' )\n\t\t\t\t.get( 0 );\n\t\t\ttextNodes = token.items.map( function ( item ) {\n\t\t\t\tvar textNode;\n\n\t\t\t\tif ( $( self.utteranceHighlightingSelector ).length ) {\n\t\t\t\t\t// Add the token highlighting within the\n\t\t\t\t\t// utterance highlightings, if there are any.\n\t\t\t\t\ttextNode = self.getNodeInUtteranceHighlighting(\n\t\t\t\t\t\titem\n\t\t\t\t\t);\n\t\t\t\t} else {\n\t\t\t\t\ttextNode = mw.wikispeech.storage.getNodeForItem( item );\n\t\t\t\t}\n\t\t\t\treturn textNode;\n\t\t\t} );\n\t\t\tstartOffset = token.startOffset;\n\t\t\tendOffset = token.endOffset;\n\t\t\tif (\n\t\t\t\t$( self.utteranceHighlightingSelector ).length &&\n\t\t\t\t\ttoken.items[ 0 ] === token.utterance.content[ 0 ]\n\t\t\t) {\n\t\t\t\t// Modify the offset if the token is the first in the\n\t\t\t\t// utterance and there is an utterance\n\t\t\t\t// highlighting. The text node may have been split\n\t\t\t\t// when the utterance highlighting was applied.\n\t\t\t\tstartOffset -= token.utterance.startOffset;\n\t\t\t\tendOffset -= token.utterance.startOffset;\n\t\t\t}\n\t\t\tself.wrapTextNodes(\n\t\t\t\tspan,\n\t\t\t\ttextNodes,\n\t\t\t\tstartOffset,\n\t\t\t\tendOffset\n\t\t\t);\n\t\t};\n\n\t\t/**\n\t\t * Get text node, within utterance highlighting, for an item.\n\t\t *\n\t\t * @param {Object} item The item to get text node for.\n\t\t */\n\n\t\tthis.getNodeInUtteranceHighlighting = function ( item ) {\n\t\t\t// Get the text node from the utterance highlighting that\n\t\t\t// wrapped the node for `textElement`.\n\t\t\tvar textNode = $( self.utteranceHighlightingSelector )\n\t\t\t\t.filter( function () {\n\t\t\t\t\treturn this.textPath ===\n\t\t\t\t\t\titem.path;\n\t\t\t\t} )\n\t\t\t\t.contents()\n\t\t\t\t.get( 0 );\n\t\t\treturn textNode;\n\t\t};\n\n\t\t/**\n\t\t * Set a timer for when the next token should be highlighted.\n\t\t *\n\t\t * @param {Object} token The original token. 
The timer is set\n\t\t *  for the token following this one.\n\t\t */\n\n\t\tthis.setHighlightTokenTimer = function ( token ) {\n\t\t\tvar currentTime, duration, nextToken;\n\n\t\t\tcurrentTime = token.utterance.audio.currentTime * 1000;\n\t\t\t// The duration of the timer is the duration of the\n\t\t\t// current token.\n\t\t\tduration = token.endTime - currentTime;\n\t\t\tnextToken = mw.wikispeech.storage.getNextToken( token );\n\t\t\tif ( nextToken ) {\n\t\t\t\tself.highlightTokenTimer = window.setTimeout(\n\t\t\t\t\tfunction () {\n\t\t\t\t\t\tself.removeWrappers(\n\t\t\t\t\t\t\t'.ext-wikispeech-highlight-word'\n\t\t\t\t\t\t);\n\t\t\t\t\t\tself.highlightToken( nextToken );\n\t\t\t\t\t\t// Add a new timer for the next token, when it\n\t\t\t\t\t\t// starts playing.\n\t\t\t\t\t\tself.setHighlightTokenTimer( nextToken );\n\t\t\t\t\t},\n\t\t\t\t\tduration / mw.user.options.get( 'wikispeechSpeechRate' )\n\t\t\t\t);\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Remove elements wrapping text nodes.\n\t\t *\n\t\t * Restores the text nodes to the way they were before they\n\t\t * were wrapped.\n\t\t *\n\t\t * @param {string} wrapperSelector The selector for the\n\t\t *  elements to remove\n\t\t */\n\n\t\tthis.removeWrappers = function ( wrapperSelector ) {\n\t\t\tvar parents, $span;\n\n\t\t\tparents = [];\n\t\t\t$span = $( wrapperSelector );\n\t\t\t$span.each( function () {\n\t\t\t\tparents.push( this.parentNode );\n\t\t\t} );\n\t\t\t$span.contents().unwrap();\n\t\t\tif ( parents.length > 0 ) {\n\t\t\t\t// Merge first and last text nodes, if the original was\n\t\t\t\t// divided by adding the <span>.\n\t\t\t\t// ESLint thinks this is String.normalize, not Text.normalize\n\n\t\t\t\tparents[ 0 ].normalize();\n\n\t\t\t\tparents[ parents.length - 1 ].normalize();\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Remove any sentence and word highlighting.\n\t\t */\n\n\t\tthis.clearHighlighting = function () {\n\t\t\t// Remove sentence highlighting.\n\t\t\tself.removeWrappers( '.ext-wikispeech-highlight-sentence' );\n\t\t\t// Remove word highlighting.\n\t\t\tself.removeWrappers( '.ext-wikispeech-highlight-word' );\n\t\t\tself.clearHighlightTokenTimer();\n\t\t};\n\n\t\t/**\n\t\t * Clear the timer for highlighting tokens.\n\t\t */\n\n\t\tthis.clearHighlightTokenTimer = function () {\n\t\t\tclearTimeout( self.highlightTokenTimer );\n\t\t};\n\t}\n\n\tmw.wikispeech = mw.wikispeech || {};\n\tmw.wikispeech.highlighter = new Highlighter();\n\tmw.wikispeech.Highlighter = Highlighter;\n}() );\n","usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.loader.js","messages":[],"suppressedMessages":[{"ruleId":"no-jquery/no-global-selector","severity":2,"message":"Avoid queries which search the entire DOM. Keep DOM nodes in memory where possible.","line":10,"column":2,"nodeType":"CallExpression","endLine":10,"endColumn":33,"suppressions":[{"kind":"directive","justification":""}]}],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.main.js","messages":[],"suppressedMessages":[{"ruleId":"no-jquery/no-global-selector","severity":2,"message":"Avoid queries which search the entire DOM. 
Keep DOM nodes in memory where possible.","line":44,"column":26,"nodeType":"CallExpression","endLine":44,"endColumn":57,"suppressions":[{"kind":"directive","justification":""}]}],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.player.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.selectionPlayer.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.specialEditLexicon.js","messages":[{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":1,"column":5,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":1,"endColumn":14},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":1,"column":16,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":1,"endColumn":24},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":1,"column":26,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":1,"endColumn":40},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":1,"column":42,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":1,"endColumn":51},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":1,"column":53,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":1,"endColumn":56},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":1,"column":58,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":1,"endColumn":72},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":2,"column":2,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":2,"endColumn":11}],"suppressedMessages":[{"ruleId":"no-jquery/no-global-selector","severity":2,"message":"Avoid queries which search the entire DOM. 
Keep DOM nodes in memory where possible.","line":6,"column":12,"nodeType":"CallExpression","endLine":6,"endColumn":35,"suppressions":[{"kind":"directive","justification":""}]}],"errorCount":0,"fatalErrorCount":0,"warningCount":7,"fixableErrorCount":0,"fixableWarningCount":0,"source":"var Previewer, $content, $transcription, $language, api, $previewPlayer,\n\tpreviewer;\n\nPreviewer = require( './ext.wikispeech.transcriptionPreviewer.js' );\n// eslint-disable-next-line no-jquery/no-global-selector\n$content = $( '#mw-content-text' );\n$language = $content.find( '#ext-wikispeech-language' ).find( 'select, input' );\n$transcription = $content.find( '#ext-wikispeech-transcription input' );\napi = new mw.Api();\n$previewPlayer = $( '<audio>' ).insertAfter( $transcription );\npreviewer = new Previewer( $language, $transcription, api, $previewPlayer );\n\n$content.find( '#ext-wikispeech-preview-button' ).on(\n\t'click',\n\tpreviewer.play.bind( previewer )\n);\n","usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.storage.js","messages":[{"ruleId":"compat/compat","severity":1,"message":"document.evaluate() is not supported in IE 11","line":998,"column":13,"nodeType":"MemberExpression","endLine":998,"endColumn":30},{"ruleId":"compat/compat","severity":1,"message":"XPathResult is not supported in IE 11","line":1002,"column":5,"nodeType":"MemberExpression","endLine":1002,"endColumn":40}],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":2,"fixableErrorCount":0,"fixableWarningCount":0,"source":"( function () {\n\n\t/**\n\t * Loads and stores objects used by the extension.\n\t *\n\t * Contains functions for other modules to retrieve information\n\t * about the utterances.\n\t *\n\t * @class ext.wikispeech.Storage\n\t * @constructor\n\t */\n\n\tfunction Storage() {\n\t\tvar self, producerApiUrl;\n\n\t\tself = this;\n\t\tself.utterances = [];\n\t\tself.utterancesLoaded = $.Deferred();\n\n\t\tif ( mw.wikispeech.consumerMode ) {\n\t\t\tproducerApiUrl = mw.wikispeech.producerUrl + '/api.php';\n\t\t\tself.api = new mw.ForeignApi( producerApiUrl );\n\t\t} else {\n\t\t\tself.api = new mw.Api();\n\t\t}\n\n\t\t/**\n\t\t * Load all utterances.\n\t\t *\n\t\t * Uses the MediaWiki API to get the segments of the text.\n\t\t *\n\t\t * @param {Object} window\n\t\t */\n\n\t\tthis.loadUtterances = function ( window ) {\n\t\t\tvar page, options;\n\n\t\t\tpage = mw.config.get( 'wgPageName' );\n\t\t\toptions = {\n\t\t\t\taction: 'wikispeech-segment',\n\t\t\t\tpage: page\n\t\t\t};\n\t\t\tif ( mw.wikispeech.consumerMode ) {\n\t\t\t\toptions[ 'consumer-url' ] = window.location.origin +\n\t\t\t\t\tmw.config.get( 'wgScriptPath' );\n\t\t\t}\n\t\t\tself.api.get(\n\t\t\t\toptions,\n\t\t\t\t{\n\t\t\t\t\tbeforeSend: function ( jqXHR, settings ) {\n\t\t\t\t\t\tmw.log(\n\t\t\t\t\t\t\t'Requesting segments:', settings.url\n\t\t\t\t\t\t);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t).done( function ( data ) {\n\t\t\t\tvar utterance, i, titleUtterance, firstNode,\n\t\t\t\t\tleadingWhitespaces, offset;\n\n\t\t\t\tmw.log( 'Segments received:', data );\n\t\t\t\tself.utterances = data[ 'wikispeech-segment' ].segments;\n\n\t\t\t\t// Add extra offset to the title if it has leading\n\t\t\t\t// whitespaces. When using the new skin, there are\n\t\t\t\t// whitespaces around the title that do not appear in\n\t\t\t\t// the display title. 
This leads to highlighting being\n\t\t\t\t// wrong.\n\t\t\t\ttitleUtterance = self.utterances[ 0 ];\n\t\t\t\tfirstNode = self.getNodeForItem( titleUtterance.content[ 0 ] );\n\t\t\t\tleadingWhitespaces = firstNode.textContent.match( /^\\s+/ );\n\t\t\t\tif ( leadingWhitespaces ) {\n\t\t\t\t\toffset = leadingWhitespaces[ 0 ].length;\n\t\t\t\t\ttitleUtterance.startOffset += offset;\n\t\t\t\t\ttitleUtterance.endOffset += offset;\n\t\t\t\t}\n\n\t\t\t\tfor ( i = 0; i < self.utterances.length; i++ ) {\n\t\t\t\t\tutterance = self.utterances[ i ];\n\t\t\t\t\tutterance.audio = $( '<audio>' ).get( 0 );\n\t\t\t\t}\n\t\t\t\tself.utterancesLoaded.resolve();\n\t\t\t\tself.prepareUtterance( self.utterances[ 0 ] );\n\t\t\t} );\n\t\t};\n\n\t\t/**\n\t\t * Prepare an utterance for playback.\n\t\t *\n\t\t * Audio for the utterance is requested from the Speechoid service\n\t\t * and event listeners are added. When an utterance starts\n\t\t * playing, the next one is prepared, and when an utterance is\n\t\t * done, the next utterance is played. This is meant to be a\n\t\t * balance between not having to pause between utterance and\n\t\t * not requesting more than needed.\n\t\t *\n\t\t * @param {Object} utterance The utterance to prepare.\n\t\t * @return {jQuery.Promise}\n\t\t */\n\n\t\tthis.prepareUtterance = function ( utterance ) {\n\t\t\tvar $audio, nextUtterance;\n\n\t\t\t$audio = $( utterance.audio );\n\t\t\tif ( !utterance.request ) {\n\t\t\t\t// Add event listener only once.\n\t\t\t\t$audio.on( 'playing', function () {\n\t\t\t\t\tvar firstToken;\n\n\t\t\t\t\t// Highlight token only when the audio starts\n\t\t\t\t\t// playing, since we need the token info from the\n\t\t\t\t\t// response to know what to highlight.\n\t\t\t\t\tif (\n\t\t\t\t\t\t!mw.wikispeech.player.playingSelection &&\n\t\t\t\t\t\t\t$audio.prop( 'currentTime' ) === 0\n\t\t\t\t\t) {\n\t\t\t\t\t\tfirstToken = utterance.tokens[ 0 ];\n\t\t\t\t\t\tmw.wikispeech.highlighter.startTokenHighlighting(\n\t\t\t\t\t\t\tfirstToken\n\t\t\t\t\t\t);\n\t\t\t\t\t}\n\t\t\t\t} );\n\t\t\t\tnextUtterance = self.getNextUtterance( utterance );\n\t\t\t\tif ( nextUtterance ) {\n\t\t\t\t\t$audio.on( {\n\t\t\t\t\t\tplay: function () {\n\t\t\t\t\t\t\tself.prepareUtterance( nextUtterance );\n\t\t\t\t\t\t},\n\t\t\t\t\t\tended: function () {\n\t\t\t\t\t\t\tmw.wikispeech.player.skipAheadUtterance();\n\t\t\t\t\t\t}\n\t\t\t\t\t} );\n\t\t\t\t} else {\n\t\t\t\t\t// For last utterance, just stop the playback when\n\t\t\t\t\t// done.\n\t\t\t\t\t$audio.on( 'ended', function () {\n\t\t\t\t\t\tmw.wikispeech.player.stop();\n\t\t\t\t\t} );\n\t\t\t\t}\n\t\t\t}\n\t\t\tif ( !utterance.request || utterance.request.state() === 'rejected' ) {\n\t\t\t\t// Only load audio for an utterance if it hasn't been\n\t\t\t\t// successfully loaded yet.\n\t\t\t\tutterance.request = self.loadAudio( utterance );\n\t\t\t}\n\t\t\treturn utterance.request;\n\t\t};\n\n\t\t/**\n\t\t * Load audio for an utterance.\n\t\t *\n\t\t * Sends a request to the Speechoid service and adds audio and tokens\n\t\t * when the response is received.\n\t\t *\n\t\t * @param {Object} utterance The utterance to load audio for.\n\t\t * @return {jQuery.Promise}\n\t\t */\n\n\t\tthis.loadAudio = function ( utterance ) {\n\t\t\tvar audioUrl, utteranceIndex;\n\n\t\t\tutteranceIndex = self.utterances.indexOf( utterance );\n\t\t\tmw.log(\n\t\t\t\t'Loading audio for utterance #' + utteranceIndex + ':',\n\t\t\t\tutterance\n\t\t\t);\n\t\t\treturn self.requestTts( utterance.hash, window )\n\t\t\t\t.done( function ( response ) 
{\n\t\t\t\t\taudioUrl = 'data:audio/ogg;base64,' +\n\t\t\t\t\t\tresponse[ 'wikispeech-listen' ].audio;\n\t\t\t\t\tmw.log(\n\t\t\t\t\t\t'Setting audio url for: [' + utteranceIndex + ']',\n\t\t\t\t\t\tutterance, '=',\n\t\t\t\t\t\tresponse[ 'wikispeech-listen' ].audio.length + ' base64 bytes'\n\t\t\t\t\t);\n\t\t\t\t\t$( utterance.audio ).attr( 'src', audioUrl );\n\t\t\t\t\tutterance.audio.playbackRate =\n\t\t\t\t\t\tmw.user.options.get( 'wikispeechSpeechRate' );\n\t\t\t\t\tself.addTokens( utterance, response[ 'wikispeech-listen' ].tokens );\n\t\t\t\t} );\n\t\t};\n\n\t\t/**\n\t\t * Send a request to the Speechoid service.\n\t\t *\n\t\t * Request is sent via the \"wikispeech-listen\" API action. Language to\n\t\t * use is retrieved from the current page.\n\t\t *\n\t\t * @param {string} segmentHash\n\t\t * @param {Object} window\n\t\t * @return {jQuery.Promise}\n\t\t */\n\n\t\tthis.requestTts = function ( segmentHash, window ) {\n\t\t\tvar request, language, voice, options;\n\n\t\t\tlanguage = mw.config.get( 'wgPageContentLanguage' );\n\t\t\tvoice = mw.wikispeech.util.getUserVoice( language );\n\t\t\toptions = {\n\t\t\t\taction: 'wikispeech-listen',\n\t\t\t\tlang: language,\n\t\t\t\trevision: mw.config.get( 'wgRevisionId' ),\n\t\t\t\tsegment: segmentHash\n\t\t\t};\n\t\t\tif ( voice !== '' ) {\n\t\t\t\t// Set voice if not default.\n\t\t\t\toptions.voice = voice;\n\t\t\t}\n\t\t\tif ( mw.wikispeech.consumerMode ) {\n\t\t\t\toptions[ 'consumer-url' ] = window.location.origin +\n\t\t\t\t\tmw.config.get( 'wgScriptPath' );\n\t\t\t}\n\t\t\trequest = self.api.get(\n\t\t\t\toptions,\n\t\t\t\t{\n\t\t\t\t\tbeforeSend: function ( jqXHR, settings ) {\n\t\t\t\t\t\tmw.log(\n\t\t\t\t\t\t\t'Sending TTS request: ' + settings.url\n\t\t\t\t\t\t);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t)\n\t\t\t\t.done( function ( data ) {\n\t\t\t\t\tmw.log( 'Response received:', data );\n\t\t\t\t} );\n\t\t\treturn request;\n\t\t};\n\n\t\t/**\n\t\t * Add tokens to an utterance.\n\t\t *\n\t\t * @param {Object} utterance The utterance to add tokens to.\n\t\t * @param {Object[]} responseTokens Tokens from a Speechoid response,\n\t\t *  where each token is an object. 
For these objects, the\n\t\t *  property \"orth\" is the string used by the TTS to generate\n\t\t *  audio for the token.\n\t\t */\n\n\t\tthis.addTokens = function ( utterance, responseTokens ) {\n\t\t\tvar i, token, startTime, searchOffset, responseToken;\n\n\t\t\tutterance.tokens = [];\n\t\t\tsearchOffset = 0;\n\t\t\tfor ( i = 0; i < responseTokens.length; i++ ) {\n\t\t\t\tresponseToken = responseTokens[ i ];\n\t\t\t\tif ( i === 0 ) {\n\t\t\t\t\t// The first token in an utterance always start on\n\t\t\t\t\t// time zero.\n\t\t\t\t\tstartTime = 0;\n\t\t\t\t} else {\n\t\t\t\t\t// Since the response only contains end times for\n\t\t\t\t\t// token, the start time for a token is set to the\n\t\t\t\t\t// end time of the previous one.\n\t\t\t\t\tstartTime = responseTokens[ i - 1 ].endtime;\n\t\t\t\t}\n\t\t\t\ttoken = {\n\t\t\t\t\tstring: responseToken.orth,\n\t\t\t\t\tstartTime: startTime,\n\t\t\t\t\tendTime: responseToken.endtime,\n\t\t\t\t\tutterance: utterance\n\t\t\t\t};\n\t\t\t\tutterance.tokens.push( token );\n\t\t\t\tif ( i > 0 ) {\n\t\t\t\t\t// Start looking for the next token after the\n\t\t\t\t\t// previous one, except for the first token, where\n\t\t\t\t\t// we want to start on zero.\n\t\t\t\t\tsearchOffset += 1;\n\t\t\t\t}\n\t\t\t\tsearchOffset = self.addOffsetsAndItems(\n\t\t\t\t\ttoken,\n\t\t\t\t\tsearchOffset\n\t\t\t\t);\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Add properties for offsets and items to a token.\n\t\t *\n\t\t * The offsets are for the start and end of the token in the\n\t\t * text node which they appear. These text nodes are not\n\t\t * necessary the same.\n\t\t *\n\t\t * The items store information used to get the text nodes in\n\t\t * which the token starts, ends and any text nodes in between.\n\t\t *\n\t\t * @param {Object} token The token to add properties to.\n\t\t * @param {number} searchOffset The offset to start searching\n\t\t *  from, in the concatenated string.\n\t\t * @return {number} The end offset in the concatenated string.\n\t\t */\n\n\t\tthis.addOffsetsAndItems = function (\n\t\t\ttoken,\n\t\t\tsearchOffset\n\t\t) {\n\t\t\tvar startOffsetInUtteranceString,\n\t\t\t\tendOffsetInUtteranceString, endOffsetForItem,\n\t\t\t\tfirstItemIndex, itemsBeforeStart, lastItemIndex,\n\t\t\t\titemsBeforeEnd, items, itemsBeforeStartLength,\n\t\t\t\titemsBeforeEndLength, utterance;\n\n\t\t\tutterance = token.utterance;\n\t\t\titems = [];\n\t\t\tstartOffsetInUtteranceString =\n\t\t\t\tself.getStartOffsetInUtteranceString(\n\t\t\t\t\ttoken.string,\n\t\t\t\t\tutterance.content,\n\t\t\t\t\titems,\n\t\t\t\t\tsearchOffset\n\t\t\t\t);\n\t\t\tendOffsetInUtteranceString =\n\t\t\t\tstartOffsetInUtteranceString +\n\t\t\t\ttoken.string.length - 1;\n\n\t\t\t// `items` now contains all the items in the utterance,\n\t\t\t// from the first one to the last, that contains at least\n\t\t\t// part of the token. 
To get only the ones that contain\n\t\t\t// part of the token, the items that appear before the\n\t\t\t// token are removed.\n\t\t\tendOffsetForItem = 0;\n\t\t\titems =\n\t\t\t\titems.filter( function ( item ) {\n\t\t\t\t\tendOffsetForItem += item.string.length;\n\t\t\t\t\treturn endOffsetForItem >\n\t\t\t\t\t\tstartOffsetInUtteranceString;\n\t\t\t\t} );\n\t\t\ttoken.items = items;\n\n\t\t\t// Calculate start and end offset for the token, in the\n\t\t\t// text nodes it appears in, and add them to the\n\t\t\t// token.\n\t\t\tfirstItemIndex =\n\t\t\t\tutterance.content.indexOf( items[ 0 ] );\n\t\t\titemsBeforeStart =\n\t\t\t\tutterance.content.slice( 0, firstItemIndex );\n\t\t\titemsBeforeStartLength = 0;\n\t\t\titemsBeforeStart.forEach( function ( item ) {\n\t\t\t\titemsBeforeStartLength += item.string.length;\n\t\t\t} );\n\t\t\ttoken.startOffset =\n\t\t\t\tstartOffsetInUtteranceString -\n\t\t\t\titemsBeforeStartLength;\n\t\t\tif ( token.items[ 0 ] === utterance.content[ 0 ] ) {\n\t\t\t\ttoken.startOffset += utterance.startOffset;\n\t\t\t}\n\t\t\tlastItemIndex =\n\t\t\t\tutterance.content.indexOf(\n\t\t\t\t\tmw.wikispeech.util.getLast( items )\n\t\t\t\t);\n\t\t\titemsBeforeEnd = utterance.content.slice( 0, lastItemIndex );\n\t\t\titemsBeforeEndLength = 0;\n\t\t\titemsBeforeEnd.forEach( function ( item ) {\n\t\t\t\titemsBeforeEndLength += item.string.length;\n\t\t\t} );\n\t\t\ttoken.endOffset =\n\t\t\t\tendOffsetInUtteranceString - itemsBeforeEndLength;\n\t\t\tif (\n\t\t\t\tmw.wikispeech.util.getLast( token.items ) ===\n\t\t\t\t\tutterance.content[ 0 ]\n\t\t\t) {\n\t\t\t\ttoken.endOffset += utterance.startOffset;\n\t\t\t}\n\t\t\treturn endOffsetInUtteranceString;\n\t\t};\n\n\t\t/**\n\t\t * Calculate the start offset of a token in the utterance string.\n\t\t *\n\t\t * The token is the first match found, starting at\n\t\t * searchOffset.\n\t\t *\n\t\t * @param {string} token The token to search for.\n\t\t * @param {Object[]} content The content of the utterance where\n\t\t *  the token appear.\n\t\t * @param {Object[]} items An array of items to which each\n\t\t *  item, up to and including the last one that contains\n\t\t *  part of the token, is added.\n\t\t * @param {number} searchOffset Where we want to start looking\n\t\t *  for the token in the utterance string.\n\t\t * @return {number} The offset where the first character of\n\t\t *  the token appears in the utterance string.\n\t\t */\n\n\t\tthis.getStartOffsetInUtteranceString = function (\n\t\t\ttoken,\n\t\t\tcontent,\n\t\t\titems,\n\t\t\tsearchOffset\n\t\t) {\n\t\t\tvar concatenatedText, startOffsetInUtteranceString,\n\t\t\t\tstringBeforeReplace;\n\n\t\t\t// The concatenation of the strings from items. Used to\n\t\t\t// find tokens that span multiple text nodes.\n\t\t\tconcatenatedText = '';\n\t\t\tcontent.every( function ( item ) {\n\t\t\t\t// Look through the items until we find a substring\n\t\t\t\t// matching the token.\n\t\t\t\t// The `replaceAll` replaces non-breaking space with a\n\t\t\t\t// normal space. This is required if Speechoid returns\n\t\t\t\t// normal spaces in \"orth\" for a token. 
See\n\t\t\t\t// https://phabricator.wikimedia.org/T286997\n\t\t\t\tconcatenatedText += item.string.replace( ' ', ' ' );\n\n\t\t\t\t// Eslint does not allow replaceAll().\n\t\t\t\tdo {\n\t\t\t\t\tstringBeforeReplace = concatenatedText;\n\t\t\t\t\tconcatenatedText = concatenatedText.replace( ' ', ' ' );\n\t\t\t\t} while ( stringBeforeReplace !== concatenatedText );\n\n\t\t\t\titems.push( item );\n\t\t\t\tif ( searchOffset > concatenatedText.length ) {\n\t\t\t\t\t// Don't look in text elements that end before\n\t\t\t\t\t// where we start looking.\n\t\t\t\t\t// continue\n\t\t\t\t\treturn true;\n\t\t\t\t}\n\t\t\t\tstartOffsetInUtteranceString = concatenatedText.indexOf(\n\t\t\t\t\ttoken, searchOffset\n\t\t\t\t);\n\t\t\t\tif ( startOffsetInUtteranceString >= 0 ) {\n\t\t\t\t\t// break\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t\treturn true;\n\t\t\t} );\n\t\t\treturn startOffsetInUtteranceString;\n\t\t};\n\n\t\t/**\n\t\t * Get the utterance after the given utterance.\n\t\t *\n\t\t * @param {Object} utterance The original utterance.\n\t\t * @return {Object} The utterance after the original\n\t\t *  utterance. null if utterance is the last one.\n\t\t */\n\n\t\tthis.getNextUtterance = function ( utterance ) {\n\t\t\treturn self.getUtteranceByOffset( utterance, 1 );\n\t\t};\n\n\t\t/**\n\t\t * Get the utterance by offset from another utterance.\n\t\t *\n\t\t * @param {Object} utterance The original utterance.\n\t\t * @param {number} offset The difference, in index, to the\n\t\t *  wanted utterance. Can be negative for preceding\n\t\t *  utterances.\n\t\t * @return {Object} The utterance on the position before or\n\t\t *  after the original utterance, as specified by\n\t\t *  `offset`. null if the original utterance is null.\n\t\t */\n\n\t\tthis.getUtteranceByOffset = function ( utterance, offset ) {\n\t\t\tvar index;\n\n\t\t\tif ( utterance === null ) {\n\t\t\t\treturn null;\n\t\t\t}\n\t\t\tindex = self.utterances.indexOf( utterance );\n\t\t\treturn self.utterances[ index + offset ];\n\t\t};\n\n\t\t/**\n\t\t * Get the utterance before the given utterance.\n\t\t *\n\t\t * @param {Object} utterance The original utterance.\n\t\t * @return {Object} The utterance before the original\n\t\t *  utterance. null if the original utterance is the\n\t\t *  first one.\n\t\t */\n\n\t\tthis.getPreviousUtterance = function ( utterance ) {\n\t\t\treturn self.getUtteranceByOffset( utterance, -1 );\n\t\t};\n\n\t\t/**\n\t\t * Get the token following a given token.\n\t\t *\n\t\t * @param {Object} originalToken Find the next token after\n\t\t *  this one.\n\t\t * @return {Object} The first token following originalToken\n\t\t *  that has time greater than zero and a transcription. null\n\t\t *  if no such token is found. Will not look beyond\n\t\t *  originalToken's utterance.\n\t\t */\n\n\t\tthis.getNextToken = function ( originalToken ) {\n\t\t\tvar index, succeedingTokens;\n\n\t\t\tindex = originalToken.utterance.tokens.indexOf( originalToken );\n\t\t\tsucceedingTokens =\n\t\t\t\toriginalToken.utterance.tokens.slice( index + 1 ).filter(\n\t\t\t\t\tfunction ( token ) {\n\t\t\t\t\t\treturn !self.isSilent( token );\n\t\t\t\t\t} );\n\t\t\tif ( succeedingTokens.length === 0 ) {\n\t\t\t\treturn null;\n\t\t\t} else {\n\t\t\t\treturn succeedingTokens[ 0 ];\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Test if a token is silent.\n\t\t *\n\t\t * Silent is here defined as either having no transcription\n\t\t * (i.e. the empty string) or having no duration (i.e. 
start\n\t\t * and end time is the same).\n\t\t *\n\t\t * @param {Object} token The token to test.\n\t\t * @return {boolean} true if the token is silent, else false.\n\t\t */\n\n\t\tthis.isSilent = function ( token ) {\n\t\t\treturn token.startTime === token.endTime ||\n\t\t\t\ttoken.string === '';\n\t\t};\n\n\t\t/**\n\t\t * Get the token preceding a given token.\n\t\t *\n\t\t * @param {Object} originalToken Find the token before this one.\n\t\t * @return {Object} The first token following originalToken\n\t\t *  that has time greater than zero and a transcription. null\n\t\t *  if no such token is found. Will not look beyond\n\t\t *  originalToken's utterance.\n\t\t */\n\n\t\tthis.getPreviousToken = function ( originalToken ) {\n\t\t\tvar index, precedingTokens, previousToken;\n\n\t\t\tindex = originalToken.utterance.tokens.indexOf( originalToken );\n\t\t\tprecedingTokens =\n\t\t\t\toriginalToken.utterance.tokens.slice( 0, index ).filter(\n\t\t\t\t\tfunction ( token ) {\n\t\t\t\t\t\treturn !self.isSilent( token );\n\t\t\t\t\t} );\n\t\t\tif ( precedingTokens.length === 0 ) {\n\t\t\t\treturn null;\n\t\t\t} else {\n\t\t\t\tpreviousToken = mw.wikispeech.util.getLast( precedingTokens );\n\t\t\t\treturn previousToken;\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Get the last non silent token in an utterance.\n\t\t *\n\t\t * @param {Object} utterance The utterance to get the last\n\t\t *  token from.\n\t\t * @return {Object} The last token in the utterance.\n\t\t */\n\n\t\tthis.getLastToken = function ( utterance ) {\n\t\t\tvar nonSilentTokens, lastToken;\n\n\t\t\tnonSilentTokens = utterance.tokens.filter( function ( token ) {\n\t\t\t\treturn !self.isSilent( token );\n\t\t\t} );\n\t\t\tlastToken = mw.wikispeech.util.getLast( nonSilentTokens );\n\t\t\treturn lastToken;\n\t\t};\n\n\t\t/**\n\t\t * Get the first text node that is a descendant of the given node.\n\t\t *\n\t\t * Finds the depth first text node, i.e. in\n\t\t *  `<a><b>1</b>2</a>`\n\t\t * the node with text \"1\" is the first one. If the given node is\n\t\t * itself a text node, it is simply returned.\n\t\t *\n\t\t * @param {HTMLElement} node The node under which to look for\n\t\t *  text nodes.\n\t\t * @param {boolean} inUtterance If true, the first text node\n\t\t *  that is also in an utterance is returned.\n\t\t * @return {Text} The first text node under `node`,\n\t\t *  undefined if there are no text nodes.\n\t\t */\n\n\t\tthis.getFirstTextNode = function ( node, inUtterance ) {\n\t\t\tvar textNode, child, i;\n\n\t\t\tif ( node.nodeType === 3 ) {\n\t\t\t\tif ( !inUtterance || self.isNodeInUtterance( node ) ) {\n\t\t\t\t\t// The given node is a text node. 
Check whether\n\t\t\t\t\t// the node is in an utterance, if that is\n\t\t\t\t\t// requested.\n\t\t\t\t\treturn node;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tfor ( i = 0; i < node.childNodes.length; i++ ) {\n\t\t\t\t\t// Check children if the given node is an element.\n\t\t\t\t\tchild = node.childNodes[ i ];\n\t\t\t\t\ttextNode = self.getFirstTextNode( child, inUtterance );\n\t\t\t\t\tif ( textNode ) {\n\t\t\t\t\t\treturn textNode;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Check if a text node is in any utterance.\n\t\t *\n\t\t * Utterances don't have any direct references to nodes, but\n\t\t * rather use XPath expressions to find the nodes that were used\n\t\t * when creating them.\n\t\t *\n\t\t * @param {Text} node The text node to check.\n\t\t * @return {boolean} true if the node is in any utterance, else false.\n\t\t */\n\n\t\tthis.isNodeInUtterance = function ( node ) {\n\t\t\tvar utterance, item, i, j;\n\n\t\t\tfor (\n\t\t\t\ti = 0;\n\t\t\t\ti < self.utterances.length;\n\t\t\t\ti++\n\t\t\t) {\n\t\t\t\tutterance = self.utterances[ i ];\n\t\t\t\tfor ( j = 0; j < utterance.content.length; j++ ) {\n\t\t\t\t\titem = utterance.content[ j ];\n\t\t\t\t\tif ( self.getNodeForItem( item ) === node ) {\n\t\t\t\t\t\treturn true;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\treturn false;\n\t\t};\n\n\t\t/**\n\t\t * Get the utterance containing a point, searching forward.\n\t\t *\n\t\t * Finds the utterance that contains a point in the text,\n\t\t * specified by a node and an offset in that node. Several\n\t\t * utterances may contain parts of the same node, which is why\n\t\t * the offset is needed.\n\t\t *\n\t\t * If the offset can't be found in the given node, later nodes\n\t\t * are checked. This happens if the offset falls between two\n\t\t * utterances.\n\t\t *\n\t\t * @param {Text} node The first node to check.\n\t\t * @param {number} offset The offset in the node.\n\t\t * @return {Object} The matching utterance.\n\t\t */\n\n\t\tthis.getStartUtterance = function ( node, offset ) {\n\t\t\tvar utterance, i, nextTextNode;\n\n\t\t\tfor ( ; offset < node.textContent.length; offset++ ) {\n\t\t\t\tfor (\n\t\t\t\t\ti = 0;\n\t\t\t\t\ti < self.utterances.length;\n\t\t\t\t\ti++\n\t\t\t\t) {\n\t\t\t\t\tutterance = self.utterances[ i ];\n\t\t\t\t\tif (\n\t\t\t\t\t\tself.isPointInItems(\n\t\t\t\t\t\t\tnode,\n\t\t\t\t\t\t\tutterance.content,\n\t\t\t\t\t\t\toffset,\n\t\t\t\t\t\t\tutterance.startOffset,\n\t\t\t\t\t\t\tutterance.endOffset\n\t\t\t\t\t\t)\n\t\t\t\t\t) {\n\t\t\t\t\t\treturn utterance;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\t// No match found in the given node, check the next one.\n\t\t\tnextTextNode = self.getNextTextNode( node );\n\t\t\treturn self.getStartUtterance( nextTextNode, 0 );\n\t\t};\n\n\t\t/**\n\t\t * Check if a point in the text is in any of a number of items.\n\t\t *\n\t\t * Checks if a node is present in any of the items. 
When a\n\t\t * matching item is found, checks if the offset falls between\n\t\t * the given min and max values.\n\t\t *\n\t\t * @param {Text} node The node to check.\n\t\t * @param {Object[]} items Item objects containing a path to\n\t\t *  the node they were created from.\n\t\t * @param {number} offset Offset in the node.\n\t\t * @param {number} minOffset The minimum offset to be\n\t\t *  considered a match.\n\t\t * @param {number} maxOffset The maximum offset to be\n\t\t *  considered a match.\n\t\t */\n\n\t\tthis.isPointInItems = function (\n\t\t\tnode,\n\t\t\titems,\n\t\t\toffset,\n\t\t\tminOffset,\n\t\t\tmaxOffset\n\t\t) {\n\t\t\tvar item, i, index;\n\n\t\t\tif ( items.length === 1 ) {\n\t\t\t\titem = items[ 0 ];\n\t\t\t\tif (\n\t\t\t\t\tself.getNodeForItem( item ) === node &&\n\t\t\t\t\t\toffset >= minOffset &&\n\t\t\t\t\t\toffset <= maxOffset\n\t\t\t\t) {\n\t\t\t\t\t// Just check if the offset is within the min and\n\t\t\t\t\t// max offsets, if there is only one item.\n\t\t\t\t\treturn true;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tfor ( i = 0; i < items.length; i++ ) {\n\t\t\t\t\titem = items[ i ];\n\t\t\t\t\tif ( self.getNodeForItem( item ) !== node ) {\n\t\t\t\t\t\t// Skip items that don't match the node we're\n\t\t\t\t\t\t// looking for.\n\t\t\t\t\t\tcontinue;\n\t\t\t\t\t}\n\t\t\t\t\tindex = items.indexOf( item );\n\t\t\t\t\tif ( index === 0 ) {\n\t\t\t\t\t\tif ( offset >= minOffset ) {\n\t\t\t\t\t\t\t// For the first node, check if position is\n\t\t\t\t\t\t\t// after the start of the utterance.\n\t\t\t\t\t\t\treturn true;\n\t\t\t\t\t\t}\n\t\t\t\t\t} else if ( index === items.length - 1 ) {\n\t\t\t\t\t\tif ( offset <= maxOffset ) {\n\t\t\t\t\t\t\t// For the last node, check if position is\n\t\t\t\t\t\t\t// before end of utterance.\n\t\t\t\t\t\t\treturn true;\n\t\t\t\t\t\t}\n\t\t\t\t\t} else {\n\t\t\t\t\t\t// Any other node should be entirely within the\n\t\t\t\t\t\t// utterance.\n\t\t\t\t\t\treturn true;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\treturn false;\n\t\t};\n\n\t\t/**\n\t\t * Get the first text node after a given node.\n\t\t *\n\t\t * @param {HTMLElement|Text} node Get the text node after\n\t\t * this one.\n\t\t * @return {Text} The first node after `node`.\n\t\t */\n\n\t\tthis.getNextTextNode = function ( node ) {\n\t\t\tvar nextNode, textNode, child, i;\n\n\t\t\tnextNode = node.nextSibling;\n\t\t\tif ( nextNode === null ) {\n\t\t\t\t// No more text nodes, start traversing the DOM\n\t\t\t\t// upward, checking sibling of ancestors.\n\t\t\t\treturn self.getNextTextNode( node.parentNode );\n\t\t\t} else if ( nextNode.nodeType === 1 ) {\n\t\t\t\t// Node is an element, find the first text node in\n\t\t\t\t// it's children.\n\t\t\t\tfor ( i = 0; i < nextNode.childNodes.length; i++ ) {\n\t\t\t\t\tchild = nextNode.childNodes[ i ];\n\t\t\t\t\ttextNode = self.getFirstTextNode( child );\n\t\t\t\t\tif ( textNode ) {\n\t\t\t\t\t\treturn textNode;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\treturn self.getNextTextNode( nextNode );\n\t\t\t} else if ( nextNode.nodeType === 3 ) {\n\t\t\t\treturn nextNode;\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Get the token containing a point, searching forward.\n\t\t *\n\t\t * Finds the token that contains a point in the text,\n\t\t * specified by a node and an offset in that node. Several\n\t\t * tokens may contain parts of the same node, which is why\n\t\t * the offset is needed.\n\t\t *\n\t\t * If the offset can't be found in the given node, later nodes\n\t\t * are checked. 
This happens if the offset falls between two\n\t\t * tokens.\n\t\t *\n\t\t * @param {Object} utterance The utterance to look for tokens in.\n\t\t * @param {Text} node The node that contains the token.\n\t\t * @param {number} offset The offset in the node.\n\t\t * @param {Object} The first token found.\n\t\t */\n\n\t\tthis.getStartToken = function ( utterance, node, offset ) {\n\t\t\tvar token, i, nextTextNode;\n\n\t\t\tfor ( ; offset < node.textContent.length; offset++ ) {\n\t\t\t\tfor ( i = 0; i < utterance.tokens.length; i++ ) {\n\t\t\t\t\ttoken = utterance.tokens[ i ];\n\t\t\t\t\tif (\n\t\t\t\t\t\tself.isPointInItems(\n\t\t\t\t\t\t\tnode,\n\t\t\t\t\t\t\ttoken.items,\n\t\t\t\t\t\t\toffset,\n\t\t\t\t\t\t\ttoken.startOffset,\n\t\t\t\t\t\t\ttoken.endOffset\n\t\t\t\t\t\t)\n\t\t\t\t\t) {\n\t\t\t\t\t\treturn token;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\t// If token wasn't found in the given node, check the next\n\t\t\t// one.\n\t\t\tnextTextNode = self.getNextTextNode( node );\n\t\t\treturn self.getStartToken( utterance, nextTextNode, 0 );\n\t\t};\n\n\t\t/**\n\t\t * Get the last text node that is a descendant of given node.\n\t\t *\n\t\t * Finds the depth first text node, i.e. in\n\t\t *  `<a>1<b>2</b></a>`\n\t\t * the node with text \"2\" is the last one. If the given node\n\t\t * is itself a text node, it is simply returned.\n\t\t *\n\t\t * @param {HTMLElement} node The node under which to look for\n\t\t *  text nodes.\n\t\t * @param {boolean} inUtterance If true, the last text node\n\t\t *  that is also in an utterance is returned.\n\t\t * @return {Text} The last text node under `node`,\n\t\t *  undefined if there are no text nodes.\n\t\t */\n\n\t\tthis.getLastTextNode = function ( node, inUtterance ) {\n\t\t\tvar i, child, textNode;\n\n\t\t\tif ( node.nodeType === 3 ) {\n\t\t\t\tif ( !inUtterance || self.isNodeInUtterance( node ) ) {\n\t\t\t\t\t// The given node is a text node. Check whether\n\t\t\t\t\t// the node is in an utterance, if that is\n\t\t\t\t\t// requested.\n\t\t\t\t\treturn node;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tfor ( i = node.childNodes.length - 1; i >= 0; i-- ) {\n\t\t\t\t\t// Check children if the given node is an element.\n\t\t\t\t\tchild = node.childNodes[ i ];\n\t\t\t\t\ttextNode = self.getLastTextNode( child, inUtterance );\n\t\t\t\t\tif ( textNode ) {\n\t\t\t\t\t\treturn textNode;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Get the utterance containing a point, searching backward.\n\t\t *\n\t\t * Finds the utterance that contains a point in the text,\n\t\t * specified by a node and an offset in that node. Several\n\t\t * utterances may contain parts of the same node, which is why\n\t\t * the offset is needed.\n\t\t *\n\t\t * If the offset can't be found in the given node, preceding\n\t\t * nodes are checked. 
This happens if the offset falls between\n\t\t * two utterances.\n\t\t *\n\t\t * @param {Text} node The first node to check.\n\t\t * @param {number} offset The offset in the node.\n\t\t * @return {Object} The matching utterance.\n\t\t */\n\n\t\tthis.getEndUtterance = function ( node, offset ) {\n\t\t\tvar utterance, i, previousTextNode;\n\n\t\t\tfor ( ; offset >= 0; offset-- ) {\n\t\t\t\tfor (\n\t\t\t\t\ti = 0;\n\t\t\t\t\ti < self.utterances.length;\n\t\t\t\t\ti++\n\t\t\t\t) {\n\t\t\t\t\tutterance = self.utterances[ i ];\n\t\t\t\t\tif (\n\t\t\t\t\t\tself.isPointInItems(\n\t\t\t\t\t\t\tnode,\n\t\t\t\t\t\t\tutterance.content,\n\t\t\t\t\t\t\toffset,\n\t\t\t\t\t\t\tutterance.startOffset,\n\t\t\t\t\t\t\tutterance.endOffset\n\t\t\t\t\t\t)\n\t\t\t\t\t) {\n\t\t\t\t\t\treturn utterance;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\tpreviousTextNode = self.getPreviousTextNode( node );\n\t\t\treturn self.getEndUtterance(\n\t\t\t\tpreviousTextNode,\n\t\t\t\tpreviousTextNode.textContent.length\n\t\t\t);\n\t\t};\n\n\t\t/**\n\t\t * Get the first text node before a given node.\n\t\t *\n\t\t * @param {HTMLElement|Text} node Get the text node before\n\t\t *  this one.\n\t\t * @return {Text} The first node before `node`.\n\t\t */\n\n\t\tthis.getPreviousTextNode = function ( node ) {\n\t\t\tvar previousNode, i, child, textNode;\n\n\t\t\tpreviousNode = node.previousSibling;\n\t\t\tif ( previousNode === null ) {\n\t\t\t\treturn self.getPreviousTextNode( node.parentNode );\n\t\t\t} else if ( previousNode.nodeType === 1 ) {\n\t\t\t\tfor (\n\t\t\t\t\ti = previousNode.childNodes.length - 1;\n\t\t\t\t\ti >= 0;\n\t\t\t\t\ti--\n\t\t\t\t) {\n\t\t\t\t\tchild = previousNode.childNodes[ i ];\n\t\t\t\t\ttextNode = self.getLastTextNode( child );\n\t\t\t\t\tif ( textNode ) {\n\t\t\t\t\t\treturn textNode;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\treturn self.getPreviousTextNode( previousNode );\n\t\t\t} else if ( previousNode.nodeType === 3 ) {\n\t\t\t\treturn previousNode;\n\t\t\t}\n\t\t};\n\n\t\t/**\n\t\t * Get the token containing a point, searching backward.\n\t\t *\n\t\t * Finds the token that contains a point in the text,\n\t\t * specified by a node and an offset in that node. Several\n\t\t * tokens may contain parts of the same node, which is why\n\t\t * the offset is needed.\n\t\t *\n\t\t * If the offset can't be found in the given node, preceding\n\t\t * nodes are checked. 
This happens if the offset falls between\n\t\t * two tokens.\n\t\t *\n\t\t * @param {Object} utterance The utterance to look for tokens in.\n\t\t * @param {Text} node The node that contains the token.\n\t\t * @param {number} offset The offset in the node.\n\t\t * @param {Object} The first token found.\n\t\t */\n\n\t\tthis.getEndToken = function ( utterance, node, offset ) {\n\t\t\tvar token, i, previousTextNode;\n\n\t\t\tfor ( ; offset >= 0; offset-- ) {\n\t\t\t\tfor ( i = 0; i < utterance.tokens.length; i++ ) {\n\t\t\t\t\ttoken = utterance.tokens[ i ];\n\t\t\t\t\tif (\n\t\t\t\t\t\tself.isPointInItems(\n\t\t\t\t\t\t\tnode,\n\t\t\t\t\t\t\ttoken.items,\n\t\t\t\t\t\t\toffset,\n\t\t\t\t\t\t\ttoken.startOffset,\n\t\t\t\t\t\t\ttoken.endOffset\n\t\t\t\t\t\t)\n\t\t\t\t\t) {\n\t\t\t\t\t\treturn token;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\tpreviousTextNode = self.getPreviousTextNode( node );\n\t\t\treturn self.getEndToken(\n\t\t\t\tutterance,\n\t\t\t\tpreviousTextNode,\n\t\t\t\tpreviousTextNode.textContent.length\n\t\t\t);\n\t\t};\n\n\t\t/**\n\t\t * Find the text node from which a content item was created.\n\t\t *\n\t\t * The path property of the item is an XPath expression\n\t\t * that is used to traverse the DOM tree.\n\t\t *\n\t\t * @param {Object} item The item to find the text node for.\n\t\t * @return {Text} The text node associated with the item.\n\t\t */\n\n\t\tthis.getNodeForItem = function ( item ) {\n\t\t\tvar node, result, contentSelector;\n\n\t\t\t// The path should be unambiguous, so just get the first\n\t\t\t// matching node.\n\t\t\tcontentSelector = mw.config.get( 'wgWikispeechContentSelector' );\n\t\t\tresult = document.evaluate(\n\t\t\t\titem.path,\n\t\t\t\t$( contentSelector ).get( 0 ),\n\t\t\t\tnull,\n\t\t\t\tXPathResult.FIRST_ORDERED_NODE_TYPE,\n\t\t\t\tnull\n\t\t\t);\n\t\t\tnode = result.singleNodeValue;\n\t\t\treturn node;\n\t\t};\n\t}\n\n\tmw.wikispeech = mw.wikispeech || {};\n\tmw.wikispeech.Storage = Storage;\n\tmw.wikispeech.storage = new Storage();\n}() );\n","usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.transcriptionPreviewer.js","messages":[{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":1,"column":5,"nodeType":"VariableDeclarator","messageId":"globalNonLexicalBinding","endLine":1,"endColumn":49},{"ruleId":"no-implicit-globals","severity":1,"message":"Unexpected function declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable.","line":12,"column":1,"nodeType":"FunctionDeclaration","messageId":"globalNonLexicalBinding","endLine":22,"endColumn":2}],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":2,"fixableErrorCount":0,"fixableWarningCount":0,"source":"var util = require( './ext.wikispeech.util.js' );\n\n/**\n * Generates audio preview for the transcription in SpecialEditLexicon.\n *\n * @class TranscriptionPreviewer\n * @param {jQuery} $language\n * @param {jQuery} $transcription\n * @param {mw.Api} api\n * @param {jQuery} $player\n */\nfunction TranscriptionPreviewer(\n\t$language,\n\t$transcription,\n\tapi,\n\t$player\n) {\n\tthis.$language = $language;\n\tthis.$transcription = $transcription;\n\tthis.api = api;\n\tthis.$player = $player;\n}\n\n/**\n * Play the transcription using TTS.\n *\n * If the transcription has changed since last play, a new one\n * retrieved. 
Otherwise the previous one is replayed.\n */\nTranscriptionPreviewer.prototype.play = function () {\n\tvar transcription = this.$transcription.val();\n\t// Rewind in case it is already playing. Just calling play() is not enought to play from start.\n\tthis.$player.prop( 'currentTime', 0 );\n\tif ( transcription !== this.lastTranscription || !this.$player.attr( 'src' ) ) {\n\t\tthis.fetchAudio();\n\t\tthis.lastTranscription = transcription;\n\t} else {\n\t\tthis.$player.get( 0 ).play();\n\t}\n};\n\n/**\n * Get audio for the player using the listen API\n */\nTranscriptionPreviewer.prototype.fetchAudio = function () {\n\tvar language, voice, transcription, self, message, title;\n\tlanguage = this.$language.val();\n\tvoice = util.getUserVoice( language );\n\ttranscription = this.$transcription.val();\n\tmw.log( 'Fetching transcription preview for (' + language + '): ' + transcription );\n\tself = this;\n\tthis.api.get( {\n\t\taction: 'wikispeech-listen',\n\t\tlang: language,\n\t\tipa: transcription,\n\t\tvoice: voice\n\t} ).done( function ( response ) {\n\t\tvar audioData = response[ 'wikispeech-listen' ].audio;\n\t\tself.$player.attr( 'src', 'data:audio/ogg;base64,' + audioData );\n\t\tself.$player.get( 0 ).play();\n\t} ).fail( function ( code, result ) {\n\t\tself.$player.attr( 'src', '' );\n\t\tmw.log.error( 'Failed to synthesize:', code, result );\n\t\tmessage = mw.msg( 'wikispeech-error-generate-preview-message' ) +\n\t\t\tresult.error.info;\n\t\ttitle = mw.msg( 'wikispeech-error-generate-preview-title' );\n\t\tOO.ui.alert( message, { title: title } );\n\t} );\n};\n\nmodule.exports = TranscriptionPreviewer;\n","usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.ui.js","messages":[],"suppressedMessages":[{"ruleId":"mediawiki/class-doc","severity":2,"message":"All possible CSS classes should be documented. 
See https://w.wiki/PS2 for details.","line":165,"column":41,"nodeType":"ObjectExpression","endLine":168,"endColumn":5,"suppressions":[{"kind":"directive","justification":""}]}],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.userOptionsDialog.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/modules/ext.wikispeech.util.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/package-lock.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/package.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/sql/abstractSchemaChanges/patch-wikispeech_utterance-wsu_date_stored.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/sql/tables.json","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/ext.wikispeech.highlighter.test.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/ext.wikispeech.player.test.js","messages":[{"ruleId":"qunit/no-loose-assertions","severity":2,"message":"Unexpected assert.ok. Use assert.strictEqual, assert.notStrictEqual, assert.deepEqual, or assert.propEqual.","line":172,"column":3,"nodeType":"CallExpression","messageId":"unexpectedLocalLooseAssertion","endLine":172,"endColumn":81},{"ruleId":"qunit/no-loose-assertions","severity":2,"message":"Unexpected assert.ok. 
Use assert.strictEqual, assert.notStrictEqual, assert.deepEqual, or assert.propEqual.","line":173,"column":3,"nodeType":"CallExpression","messageId":"unexpectedLocalLooseAssertion","endLine":173,"endColumn":82}],"suppressedMessages":[],"errorCount":2,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"( function () {\n\tvar player, storage, selectionPlayer, highlighter, ui;\n\n\tQUnit.module( 'ext.wikispeech.player', {\n\t\tbeforeEach: function () {\n\t\t\tmw.wikispeech.highlighter =\n\t\t\t\tsinon.stub( new mw.wikispeech.Highlighter() );\n\t\t\thighlighter = mw.wikispeech.highlighter;\n\t\t\tmw.wikispeech.ui =\n\t\t\t\tsinon.stub( new mw.wikispeech.Ui() );\n\t\t\tui = mw.wikispeech.ui;\n\t\t\tmw.wikispeech.selectionPlayer =\n\t\t\t\tsinon.stub( new mw.wikispeech.SelectionPlayer() );\n\t\t\tselectionPlayer = mw.wikispeech.selectionPlayer;\n\t\t\tmw.wikispeech.storage =\n\t\t\t\tsinon.stub( new mw.wikispeech.Storage() );\n\t\t\tstorage = mw.wikispeech.storage;\n\t\t\tstorage.utterancesLoaded.resolve();\n\t\t\tstorage.utterances = [\n\t\t\t\t{\n\t\t\t\t\taudio: {\n\t\t\t\t\t\tplay: function () {},\n\t\t\t\t\t\tpause: function () {}\n\t\t\t\t\t},\n\t\t\t\t\tcontent: []\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\taudio: {\n\t\t\t\t\t\tplay: function () {},\n\t\t\t\t\t\tpause: function () {}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t];\n\t\t\tplayer = new mw.wikispeech.Player();\n\t\t\tmw.config.set(\n\t\t\t\t'wgWikispeechSkipBackRewindsThreshold',\n\t\t\t\t3.0\n\t\t\t);\n\t\t\t// Base case is that there is no selection. Test that test\n\t\t\t// for when there is a selection overwrites this.\n\t\t\tselectionPlayer.playSelectionIfValid.returns( false );\n\t\t}\n\t} );\n\n\tQUnit.test( 'playOrStop(): play', function ( assert ) {\n\t\tsinon.stub( player, 'play' );\n\n\t\tplayer.playOrStop();\n\n\t\tassert.strictEqual( player.play.called, true );\n\t} );\n\n\tQUnit.test( 'playOrStop(): stop', function ( assert ) {\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\t\tsinon.stub( player, 'stop' );\n\n\t\tplayer.playOrStop();\n\n\t\tassert.strictEqual( player.stop.called, true );\n\t} );\n\n\tQUnit.test( 'stop()', function () {\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 1.0;\n\t\tsinon.stub( player, 'stopUtterance' );\n\n\t\tplayer.stop();\n\n\t\tsinon.assert.calledWith(\n\t\t\tplayer.stopUtterance, storage.utterances[ 0 ]\n\t\t);\n\t\tsinon.assert.called( ui.setPlayStopIconToPlay );\n\t\tsinon.assert.called( ui.hideBufferingIcon );\n\t} );\n\n\tQUnit.test( 'play()', function () {\n\t\tsinon.stub( player, 'playUtterance' );\n\n\t\tplayer.play();\n\n\t\tsinon.assert.called( player.playUtterance );\n\t} );\n\n\tQUnit.test( 'play(): delay until utterances has been loaded', function () {\n\t\tsinon.stub( player, 'playUtterance' );\n\t\t// We want an unresolved promise for this test.\n\t\tstorage.utterancesLoaded = $.Deferred();\n\n\t\tplayer.play();\n\n\t\tsinon.assert.notCalled( player.playUtterance );\n\t} );\n\n\tQUnit.test( 'play(): do not play utterance when selection is valid', function () {\n\t\tsinon.stub( player, 'playUtterance' );\n\t\tselectionPlayer.playSelectionIfValid.returns( true );\n\n\t\tplayer.play();\n\n\t\tsinon.assert.notCalled( player.playUtterance );\n\t} );\n\n\tQUnit.test( 'play(): play from beginning when selection is invalid', function () {\n\t\tsinon.stub( player, 'playUtterance' );\n\t\tselectionPlayer.playSelectionIfValid.returns( false 
);\n\n\t\tplayer.play();\n\n\t\tsinon.assert.calledWith(\n\t\t\tplayer.playUtterance,\n\t\t\tstorage.utterances[ 0 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'playUtterance()', function () {\n\t\tvar utterance = storage.utterances[ 0 ];\n\t\tsinon.stub( utterance.audio, 'play' );\n\t\tstorage.prepareUtterance.returns( $.Deferred().resolve() );\n\n\t\tplayer.playUtterance( utterance );\n\n\t\tsinon.assert.called( utterance.audio.play );\n\t\tsinon.assert.calledWith( highlighter.highlightUtterance, utterance );\n\t\tsinon.assert.calledWith(\n\t\t\tui.showBufferingIconIfAudioIsLoading,\n\t\t\tutterance.audio\n\t\t);\n\t} );\n\n\tQUnit.test( 'playUtterance(): stop playing utterance', function () {\n\t\tstorage.prepareUtterance.returns( $.Deferred().resolve() );\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\t\tsinon.stub( player, 'stopUtterance' );\n\n\t\tplayer.playUtterance( storage.utterances[ 1 ] );\n\n\t\tsinon.assert.calledWith(\n\t\t\tplayer.stopUtterance,\n\t\t\tstorage.utterances[ 0 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'playUtterance(): show load error dialog', function () {\n\t\tvar utterance = storage.utterances[ 0 ];\n\t\tstorage.prepareUtterance.returns( $.Deferred().reject() );\n\t\tui.showLoadAudioError.returns( $.Deferred() );\n\n\t\tplayer.playUtterance( utterance );\n\n\t\tsinon.assert.called( ui.showLoadAudioError );\n\t} );\n\n\tQUnit.test( 'playUtterance(): show load error dialog again', function () {\n\t\tvar utterance = storage.utterances[ 0 ];\n\t\tstorage.prepareUtterance.returns( $.Deferred().reject() );\n\t\tui.showLoadAudioError.onFirstCall().returns( $.Deferred().resolveWith( null, [ { action: 'retry' } ] ) );\n\t\tui.showLoadAudioError.returns( $.Deferred() );\n\n\t\tplayer.playUtterance( utterance );\n\n\t\tsinon.assert.calledTwice( ui.showLoadAudioError );\n\t} );\n\n\tQUnit.test( 'playUtterance(): retry preparing utterance', function ( assert ) {\n\t\tvar utterance = storage.utterances[ 0 ];\n\t\tstorage.prepareUtterance.returns( $.Deferred().reject() );\n\t\tui.showLoadAudioError.onFirstCall().returns( $.Deferred().resolveWith( null, [ { action: 'retry' } ] ) );\n\t\tui.showLoadAudioError.returns( $.Deferred().resolve() );\n\n\t\tplayer.playUtterance( utterance );\n\n\t\tassert.ok( storage.prepareUtterance.firstCall.calledWithExactly( utterance ) );\n\t\tassert.ok( storage.prepareUtterance.secondCall.calledWithExactly( utterance ) );\n\t} );\n\n\tQUnit.test( 'stopUtterance()', function ( assert ) {\n\t\tstorage.utterances[ 0 ].audio.currentTime = 1.0;\n\t\tsinon.stub( storage.utterances[ 0 ].audio, 'pause' );\n\n\t\tplayer.stopUtterance( storage.utterances[ 0 ] );\n\n\t\tsinon.assert.called( storage.utterances[ 0 ].audio.pause );\n\t\tassert.strictEqual( storage.utterances[ 0 ].audio.currentTime, 0 );\n\t\tsinon.assert.called( highlighter.clearHighlighting );\n\t\tsinon.assert.calledWith(\n\t\t\tui.removeCanPlayListener,\n\t\t\t$( storage.utterances[ 0 ].audio )\n\t\t);\n\t} );\n\n\tQUnit.test( 'skipAheadUtterance()', function () {\n\t\tsinon.stub( player, 'playUtterance' );\n\t\tstorage.getNextUtterance.returns( storage.utterances[ 1 ] );\n\n\t\tplayer.skipAheadUtterance();\n\n\t\tsinon.assert.calledWith(\n\t\t\tplayer.playUtterance,\n\t\t\tstorage.utterances[ 1 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'skipAheadUtterance(): stop if no next utterance', function () {\n\t\tsinon.stub( player, 'stop' );\n\t\tstorage.getNextUtterance.returns( null );\n\n\t\tplayer.skipAheadUtterance();\n\n\t\tsinon.assert.called( player.stop );\n\t} );\n\n\tQUnit.test( 
'skipBackUtterance()', function () {\n\t\tsinon.stub( player, 'playUtterance' );\n\t\tplayer.currentUtterance = storage.utterances[ 1 ];\n\t\tstorage.getPreviousUtterance.returns( storage.utterances[ 0 ] );\n\n\t\tplayer.skipBackUtterance();\n\n\t\tsinon.assert.calledWith(\n\t\t\tplayer.playUtterance,\n\t\t\tstorage.utterances[ 0 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'skipBackUtterance(): restart if first utterance', function ( assert ) {\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 1.0;\n\t\tsinon.stub( storage.utterances[ 0 ].audio, 'pause' );\n\n\t\tplayer.skipBackUtterance();\n\n\t\tassert.strictEqual(\n\t\t\tstorage.utterances[ 0 ].audio.currentTime,\n\t\t\t0\n\t\t);\n\t\tsinon.assert.notCalled( storage.utterances[ 0 ].audio.pause );\n\t} );\n\n\tQUnit.test( 'skipBackUtterance(): restart if played long enough', function ( assert ) {\n\t\tplayer.currentUtterance = storage.utterances[ 1 ];\n\t\tstorage.utterances[ 1 ].audio.currentTime = 3.1;\n\t\tsinon.stub( player, 'playUtterance' );\n\t\tstorage.getPreviousUtterance.returns( storage.utterances[ 0 ] );\n\n\t\tplayer.skipBackUtterance();\n\n\t\tassert.strictEqual(\n\t\t\tstorage.utterances[ 1 ].audio.currentTime,\n\t\t\t0\n\t\t);\n\t\tsinon.assert.neverCalledWith(\n\t\t\tplayer.playUtterance, storage.utterances[ 0 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'getCurrentToken()', function ( assert ) {\n\t\tvar token;\n\n\t\tstorage.utterances[ 0 ].audio.src = 'loaded';\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 2000,\n\t\t\t\tendTime: 3000\n\t\t\t}\n\t\t];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 1.1;\n\t\tstorage.utterancesLoaded.resolve();\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\n\t\ttoken = player.getCurrentToken();\n\n\t\tassert.strictEqual( token, storage.utterances[ 0 ].tokens[ 1 ] );\n\t} );\n\n\tQUnit.test( 'getCurrentToken(): get first token', function ( assert ) {\n\t\tvar token;\n\n\t\tstorage.utterances[ 0 ].audio.src = 'loaded';\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 2000,\n\t\t\t\tendTime: 3000\n\t\t\t}\n\t\t];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 0.1;\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\n\t\ttoken = player.getCurrentToken();\n\n\t\tassert.strictEqual( token, storage.utterances[ 0 ].tokens[ 0 ] );\n\t} );\n\n\tQUnit.test( 'getCurrentToken(): get the last token', function ( assert ) {\n\t\tvar token;\n\n\t\tstorage.utterances[ 0 ].audio.src = 'loaded';\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 2000,\n\t\t\t\tendTime: 3000\n\t\t\t}\n\t\t];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 2.1;\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\n\t\ttoken = player.getCurrentToken();\n\n\t\tassert.strictEqual( token, storage.utterances[ 0 ].tokens[ 2 ] );\n\t} );\n\n\tQUnit.test( 'getCurrentToken(): get the last token when current time is equal to last tokens end time', function ( assert ) {\n\t\tvar token;\n\n\t\tstorage.utterances[ 0 ].audio.src = 'loaded';\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 
0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t}\n\t\t];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 2.0;\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\n\t\ttoken = player.getCurrentToken();\n\n\t\tassert.strictEqual( token, storage.utterances[ 0 ].tokens[ 1 ] );\n\t} );\n\n\tQUnit.test( 'getCurrentToken(): ignore tokens with no duration', function ( assert ) {\n\t\tvar token;\n\n\t\tstorage.utterances[ 0 ].audio.src = 'loaded';\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t}\n\t\t];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 1.0;\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\n\t\ttoken = player.getCurrentToken();\n\n\t\tassert.strictEqual(\n\t\t\ttoken,\n\t\t\tstorage.utterances[ 0 ].tokens[ 2 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'getCurrentToken(): give correct token if there are tokens with no duration', function ( assert ) {\n\t\tvar token;\n\n\t\tstorage.utterances[ 0 ].audio.src = 'loaded';\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t}\n\t\t];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 1.1;\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\n\t\ttoken = player.getCurrentToken();\n\n\t\tassert.strictEqual( token, storage.utterances[ 0 ].tokens[ 2 ] );\n\t} );\n\n\tQUnit.test( 'skipAheadToken()', function ( assert ) {\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t}\n\t\t];\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\t\tstorage.getNextToken.returns( storage.utterances[ 0 ].tokens[ 1 ] );\n\n\t\tplayer.skipAheadToken();\n\n\t\tassert.strictEqual(\n\t\t\tstorage.utterances[ 0 ].audio.currentTime,\n\t\t\t1.0\n\t\t);\n\t\tsinon.assert.calledWith(\n\t\t\thighlighter.startTokenHighlighting,\n\t\t\tstorage.utterances[ 0 ].tokens[ 1 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'skipAheadToken(): skip ahead utterance when last token', function () {\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t}\n\t\t];\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 0.1;\n\t\tsinon.stub( player, 'skipAheadUtterance' );\n\n\t\tplayer.skipAheadToken();\n\n\t\tsinon.assert.called( player.skipAheadUtterance );\n\t} );\n\n\tQUnit.test( 'skipBackToken()', function ( assert ) {\n\t\tstorage.utterances[ 0 ].tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t}\n\t\t];\n\t\tplayer.currentUtterance = storage.utterances[ 0 ];\n\t\tstorage.utterances[ 0 ].audio.currentTime = 1.1;\n\t\tstorage.getPreviousToken.returns(\n\t\t\tstorage.utterances[ 0 ].tokens[ 0 ]\n\t\t);\n\n\t\tplayer.skipBackToken();\n\n\t\tassert.strictEqual(\n\t\t\tstorage.utterances[ 0 ].audio.currentTime,\n\t\t\t0\n\t\t);\n\t\tsinon.assert.calledWith(\n\t\t\thighlighter.startTokenHighlighting,\n\t\t\tstorage.utterances[ 0 ].tokens[ 0 ]\n\t\t);\n\t} );\n\n\tQUnit.test( 'skipBackToken(): skip to last token in previous utterance if first token', function ( 
assert ) {\n\t\tvar currentUtterance, previousUtterance;\n\n\t\tpreviousUtterance = storage.utterances[ 0 ];\n\t\tpreviousUtterance.tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t},\n\t\t\t{\n\t\t\t\tstartTime: 1000,\n\t\t\t\tendTime: 2000\n\t\t\t}\n\t\t];\n\t\tcurrentUtterance = storage.utterances[ 1 ];\n\t\tcurrentUtterance.tokens = [\n\t\t\t{\n\t\t\t\tstartTime: 0,\n\t\t\t\tendTime: 1000\n\t\t\t}\n\t\t];\n\t\tplayer.currentUtterance = currentUtterance;\n\t\tstorage.getPreviousToken.returns( null );\n\t\tstorage.getLastToken.returns( previousUtterance.tokens[ 1 ] );\n\t\t// Custom mocking since currentUtterance needs\n\t\t// to change during the skipBackToken.\n\t\tplayer.skipBackUtterance = function () {\n\t\t\tplayer.currentUtterance = previousUtterance;\n\t\t};\n\t\tsinon.spy( player, 'skipBackUtterance' );\n\n\t\tplayer.skipBackToken();\n\n\t\tsinon.assert.calledOnce( player.skipBackUtterance );\n\t\tassert.strictEqual( previousUtterance.audio.currentTime, 1.0 );\n\t} );\n}() );\n","usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/ext.wikispeech.selectionPlayer.test.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/ext.wikispeech.storage.test.js","messages":[],"suppressedMessages":[{"ruleId":"no-jquery/no-parse-html-literal","severity":2,"message":"Prefer DOM building to parsing HTML literals","line":43,"column":52,"nodeType":"CallExpression","endLine":43,"endColumn":72,"suppressions":[{"kind":"directive","justification":""}]},{"ruleId":"no-jquery/no-parse-html-literal","severity":2,"message":"Prefer DOM building to parsing HTML literals","line":92,"column":52,"nodeType":"CallExpression","endLine":92,"endColumn":72,"suppressions":[{"kind":"directive","justification":""}]},{"ruleId":"no-jquery/no-parse-html-literal","severity":2,"message":"Prefer DOM building to parsing HTML literals","line":122,"column":52,"nodeType":"CallExpression","endLine":122,"endColumn":75,"suppressions":[{"kind":"directive","justification":""}]}],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/ext.wikispeech.test.util.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/ext.wikispeech.transcriptionPreviewer.test.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/ext.wikispeech.ui.test.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]},{"filePath":"/src/repo/tests/qunit/index.js","messages":[],"suppressedMessages":[],"errorCount":0,"fatalErrorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[]}]

--- end ---
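The only hard errors in the ESLint JSON report above are the two qunit/no-loose-assertions hits in tests/qunit/ext.wikispeech.player.test.js (report lines 172-173), where assert.ok is used on a boolean spy check. A minimal sketch of the change that rule asks for, reusing the test body shown in the report, would be:

	// Before (flagged by qunit/no-loose-assertions):
	assert.ok( storage.prepareUtterance.firstCall.calledWithExactly( utterance ) );
	assert.ok( storage.prepareUtterance.secondCall.calledWithExactly( utterance ) );

	// After: compare the boolean result explicitly with a strict assertion.
	assert.strictEqual( storage.prepareUtterance.firstCall.calledWithExactly( utterance ), true );
	assert.strictEqual( storage.prepareUtterance.secondCall.calledWithExactly( utterance ), true );

Similarly, the seven no-implicit-globals warnings for modules/ext.wikispeech.specialEditLexicon.js would be resolved by wrapping the file's top-level var declarations in an IIFE, as the warning text itself suggests. This is a sketch only, built from the source string embedded in the report (the existing eslint-disable comment is kept):

	( function () {
		var Previewer, $content, $transcription, $language, api, $previewPlayer,
			previewer;

		Previewer = require( './ext.wikispeech.transcriptionPreviewer.js' );
		// eslint-disable-next-line no-jquery/no-global-selector
		$content = $( '#mw-content-text' );
		$language = $content.find( '#ext-wikispeech-language' ).find( 'select, input' );
		$transcription = $content.find( '#ext-wikispeech-transcription input' );
		api = new mw.Api();
		$previewPlayer = $( '<audio>' ).insertAfter( $transcription );
		previewer = new Previewer( $language, $transcription, api, $previewPlayer );

		$content.find( '#ext-wikispeech-preview-button' ).on(
			'click',
			previewer.play.bind( previewer )
		);
	}() );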
$ ./node_modules/.bin/grunt stylelint
--- stdout ---
Running "stylelint:all" (stylelint) task
>> Linted 1 files without errors

Done.

--- end ---
$ /usr/bin/npm ci --legacy-peer-deps
--- stdout ---

added 395 packages, and audited 396 packages in 3s

68 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities

--- end ---
$ /usr/bin/npm test
--- stdout ---

> test
> grunt test

Running "eslint:all" (eslint) task

/src/repo/Gruntfile.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/composer.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/docs/gadget-template.js
  1:1   error    Definition for rule 'qunit/no-loose-assertions' was not found                                                                            qunit/no-loose-assertions
  8:5   warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  8:18  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  8:36  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals

/src/repo/extension.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/ar.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/br.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/de.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/en.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/es.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/eu.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/fa.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/fr.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/he.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/hi.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/hr.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/ia.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/ig.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/io.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/ko.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/lb.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/mk.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/nb.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/nl.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/pl.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/pt-br.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/pt.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/qqq.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/scn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/se.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/sh.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/skr-arab.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/sl.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/sma.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/smn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/sms.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/sv.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/tr.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/uk.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/vec.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/zh-hans.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/api/zh-hant.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ar.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/arc.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ary.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/av.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/bn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/bpy.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/br.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ca.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ce.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/cs.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/da.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/dag.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/de.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/diq.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/en.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/eo.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/es.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/eu.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/fa.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/fi.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/fr.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/he.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/hi.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/hr.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/hu.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/hy.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ia.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/io.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/it.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ja.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/kaa.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/kn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ko.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ku-latn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/lb.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/mk.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/mnw.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/mrh.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ms.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/my.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/myv.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/mzn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/nb.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/nl.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/om.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/pl.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/pnb.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/pt-br.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/pt.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/qqq.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/rki.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/rmc.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/roa-tara.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ru.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/scn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sco.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sd.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/se.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sh.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/skr-arab.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sl.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/smn.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sms.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sq.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sr-ec.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sr-el.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/sv.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/te.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/tly.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/tr.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ug-arab.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/uk.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/ur.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/vec.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/xmf.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/yue.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/zgh.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/zh-hans.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/i18n/zh-hant.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/jsduck.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.gadget.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.highlighter.js
    1:1  error    Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions
  250:5  warning  ES2015 'String.prototype.normalize' method is forbidden        es-x/no-string-prototype-normalize
  252:5  warning  ES2015 'String.prototype.normalize' method is forbidden        es-x/no-string-prototype-normalize

/src/repo/modules/ext.wikispeech.loader.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.main.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.player.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.selectionPlayer.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.specialEditLexicon.js
  1:1   error    Definition for rule 'qunit/no-loose-assertions' was not found                                                                            qunit/no-loose-assertions
  1:5   warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:16  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:26  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:42  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:53  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  1:58  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals
  2:2   warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals

/src/repo/modules/ext.wikispeech.storage.js
     1:1   error    Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions
   998:13  warning  document.evaluate() is not supported in IE 11                  compat/compat
  1002:5   warning  XPathResult is not supported in IE 11                          compat/compat

/src/repo/modules/ext.wikispeech.transcriptionPreviewer.js
   1:1  error    Definition for rule 'qunit/no-loose-assertions' was not found                                                                               qunit/no-loose-assertions
   1:5  warning  Unexpected 'var' declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable     no-implicit-globals
  12:1  warning  Unexpected function declaration in the global scope, wrap in an IIFE for a local variable, assign as global property for a global variable  no-implicit-globals

/src/repo/modules/ext.wikispeech.ui.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.userOptionsDialog.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/modules/ext.wikispeech.util.js
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/package-lock.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/package.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/sql/abstractSchemaChanges/patch-wikispeech_utterance-wsu_date_stored.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/sql/tables.json
  1:1  error  Definition for rule 'qunit/no-loose-assertions' was not found  qunit/no-loose-assertions

/src/repo/tests/qunit/ext.wikispeech.player.test.js
  172:3  error  Unexpected assert.ok. Use assert.strictEqual, assert.notStrictEqual, assert.deepEqual, or assert.propEqual  qunit/no-loose-assertions
  173:3  error  Unexpected assert.ok. Use assert.strictEqual, assert.notStrictEqual, assert.deepEqual, or assert.propEqual  qunit/no-loose-assertions

✖ 155 problems (139 errors, 16 warnings)

Warning: Task "eslint:all" failed. Use --force to continue.

Aborted due to warnings.

--- end ---
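Two distinct problems are mixed into the eslint:all failure above. Nearly all of the 139 errors are "Definition for rule 'qunit/no-loose-assertions' was not found", reported once per linted file: ESLint cannot resolve that rule name for those files, which usually means the rule is enabled by configuration that applies repo-wide while the eslint-plugin-qunit plugin that defines it is only loaded for the QUnit test directory, or that the installed plugin version no longer ships the rule. Scoping or removing the enabling entry (for example, dropping "qunit/no-loose-assertions" from the repo-wide rules, or moving it under tests/qunit) should clear those. The two remaining errors, at lines 172-173 of tests/qunit/ext.wikispeech.player.test.js, are genuine violations and need the assertions tightened along the lines the rule message suggests; a hedged sketch (the asserted expression here is hypothetical, not the actual code on those lines):

    // Hypothetical fix for a qunit/no-loose-assertions violation.
    // Flagged: loose truthiness assertion.
    assert.ok( player.skipBackUtterance.called );
    // Preferred: assert the exact expected value.
    assert.strictEqual( player.skipBackUtterance.called, true );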
Traceback (most recent call last):
  File "/venv/lib/python3.9/site-packages/runner-0.1.0-py3.9.egg/runner/__init__.py", line 1400, in main
    libup.run(args.repo, args.output, args.branch)
  File "/venv/lib/python3.9/site-packages/runner-0.1.0-py3.9.egg/runner/__init__.py", line 1338, in run
    self.npm_upgrade(plan)
  File "/venv/lib/python3.9/site-packages/runner-0.1.0-py3.9.egg/runner/__init__.py", line 1049, in npm_upgrade
    self.npm_test()
  File "/venv/lib/python3.9/site-packages/runner-0.1.0-py3.9.egg/runner/__init__.py", line 287, in npm_test
    self.check_call(['npm', 'test'])
  File "/venv/lib/python3.9/site-packages/runner-0.1.0-py3.9.egg/runner/shell2.py", line 54, in check_call
    res.check_returncode()
  File "/usr/lib/python3.9/subprocess.py", line 460, in check_returncode
    raise CalledProcessError(self.returncode, self.args, self.stdout,
subprocess.CalledProcessError: Command '['/usr/bin/npm', 'test']' returned non-zero exit status 3.
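npm test exits non-zero because grunt aborts on the failed eslint:all task (exit status 3 is the code grunt uses for task errors); the runner's check_call wrapper then surfaces this through subprocess.check_returncode() as the CalledProcessError above, ending the run at the npm_upgrade step.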