mediawiki/services/parsoid (master)

$ date
Thu Mar  4 13:09:11 UTC 2021

$ git clone file:///srv/git/mediawiki-services-parsoid.git repo --depth=1 -b master
Cloning into 'repo'...

$ git config user.name libraryupgrader

$ git config user.email tools.libraryupgrader@tools.wmflabs.org

$ git submodule update --init

$ grr init
Installed commit-msg hook.

$ git show-ref refs/heads/master
28f176ac9247987010a8377251f5c81fe417620d refs/heads/master

$ composer install
Loading composer repositories with package information
Warning from https://repo.packagist.org: You are using an outdated version of Composer. Composer 2 is now available and you should upgrade. See https://getcomposer.org/2
Updating dependencies (including require-dev)
Package operations: 79 installs, 0 updates, 0 removals
  - Installing liuggio/statsd-php-client (v1.0.18): Loading from cache
  - Installing psr/container (1.0.0): Loading from cache
  - Installing wikimedia/object-factory (v3.0.0): Loading from cache
  - Installing wikimedia/scoped-callback (v3.0.0): Loading from cache
  - Installing wikimedia/wikipeg (2.0.5): Loading from cache
  - Installing php-parallel-lint/php-console-color (v0.3): Loading from cache
  - Installing php-parallel-lint/php-parallel-lint (v1.2.0): Loading from cache
  - Installing squizlabs/php_codesniffer (3.5.8): Loading from cache
  - Installing composer/spdx-licenses (1.5.5): Loading from cache
  - Installing composer/semver (1.7.2): Loading from cache
  - Installing mediawiki/mediawiki-codesniffer (v34.0.0): Loading from cache
  - Installing symfony/polyfill-php80 (v1.22.1): Loading from cache
  - Installing symfony/polyfill-mbstring (v1.22.1): Loading from cache
  - Installing symfony/polyfill-intl-normalizer (v1.22.1): Loading from cache
  - Installing symfony/polyfill-intl-grapheme (v1.22.1): Loading from cache
  - Installing symfony/polyfill-ctype (v1.22.1): Loading from cache
  - Installing symfony/string (v5.2.3): Loading from cache
  - Installing symfony/service-contracts (v2.2.0): Loading from cache
  - Installing symfony/polyfill-php73 (v1.22.1): Loading from cache
  - Installing symfony/console (v5.2.3): Loading from cache
  - Installing psr/log (1.1.3): Loading from cache
  - Installing sabre/event (5.1.2): Loading from cache
  - Installing netresearch/jsonmapper (v2.1.0): Loading from cache
  - Installing microsoft/tolerant-php-parser (v0.0.23): Loading from cache
  - Installing phpdocumentor/reflection-common (2.2.0): Loading from cache
  - Installing webmozart/assert (1.9.1): Loading from cache
  - Installing phpdocumentor/type-resolver (1.4.0): Loading from cache
  - Installing phpdocumentor/reflection-docblock (5.2.2): Loading from cache
  - Installing felixfbecker/advanced-json-rpc (v3.2.0): Loading from cache
  - Installing composer/xdebug-handler (1.4.5): Loading from cache
  - Installing phan/phan (3.2.6): Loading from cache
  - Installing mediawiki/phan-taint-check-plugin (3.2.1): Loading from cache
  - Installing mediawiki/mediawiki-phan-config (0.10.6): Loading from cache
  - Installing mediawiki/minus-x (1.1.0): Loading from cache
  - Installing wikimedia/alea (0.9.2): Loading from cache
  - Installing wikimedia/assert (v0.5.0): Loading from cache
  - Installing wikimedia/langconv (0.3.5): Loading from cache
  - Installing wikimedia/testing-access-wrapper (1.0.0): Loading from cache
  - Installing sebastian/version (2.0.1): Loading from cache
  - Installing sebastian/type (1.1.4): Loading from cache
  - Installing sebastian/resource-operations (2.0.2): Loading from cache
  - Installing sebastian/recursion-context (3.0.1): Loading from cache
  - Installing sebastian/object-reflector (1.1.2): Loading from cache
  - Installing sebastian/object-enumerator (3.0.4): Loading from cache
  - Installing sebastian/global-state (3.0.1): Loading from cache
  - Installing sebastian/exporter (3.1.3): Loading from cache
  - Installing sebastian/environment (4.2.4): Loading from cache
  - Installing sebastian/diff (3.0.3): Loading from cache
  - Installing sebastian/comparator (3.0.3): Loading from cache
  - Installing phpunit/php-timer (2.1.3): Loading from cache
  - Installing phpunit/php-text-template (1.2.1): Loading from cache
  - Installing phpunit/php-file-iterator (2.0.3): Loading from cache
  - Installing theseer/tokenizer (1.2.0): Loading from cache
  - Installing sebastian/code-unit-reverse-lookup (1.0.2): Loading from cache
  - Installing phpunit/php-token-stream (4.0.4): Loading from cache
  - Installing phpunit/php-code-coverage (7.0.14): Loading from cache
  - Installing doctrine/instantiator (1.4.0): Loading from cache
  - Installing phpspec/prophecy (1.12.2): Loading from cache
  - Installing myclabs/deep-copy (1.10.2): Loading from cache
  - Installing phar-io/version (3.1.0): Loading from cache
  - Installing phar-io/manifest (2.0.1): Loading from cache
  - Installing phpunit/phpunit (8.5.14): Loading from cache
  - Installing ockcyp/covers-validator (v1.3.1): Loading from cache
  - Installing wikimedia/at-ease (v2.0.0): Loading from cache
  - Installing wikimedia/ip-set (2.1.0): Loading from cache
  - Installing wikimedia/base-convert (v2.0.1): Loading from cache
  - Installing wikimedia/ip-utils (3.0.1): Loading from cache
  - Installing wikimedia/utfnormal (3.0.1): Loading from cache
  - Installing wikimedia/remex-html (2.3.0): Loading from cache
  - Installing wikimedia/zest-css (1.1.4): Loading from cache
  - Installing monolog/monolog (2.2.0): Loading from cache
  - Installing symfony/process (v5.2.3): Loading from cache
  - Installing symfony/finder (v5.2.3): Loading from cache
  - Installing symfony/filesystem (v5.2.3): Loading from cache
  - Installing seld/phar-utils (1.1.1): Loading from cache
  - Installing seld/jsonlint (1.8.3): Loading from cache
  - Installing justinrainbow/json-schema (5.2.10): Loading from cache
  - Installing composer/ca-bundle (1.2.9): Loading from cache
  - Installing composer/composer (1.10.20): Loading from cache
php-parallel-lint/php-parallel-lint suggests installing php-parallel-lint/php-console-highlighter (Highlight syntax in code snippet)
symfony/service-contracts suggests installing symfony/service-implementation
symfony/console suggests installing symfony/event-dispatcher
symfony/console suggests installing symfony/lock
phan/phan suggests installing ext-ast (Needed for parsing ASTs (unless --use-fallback-parser is used). 1.0.1+ is needed, 1.0.8+ is recommended.)
sebastian/global-state suggests installing ext-uopz (*)
phpunit/php-code-coverage suggests installing ext-xdebug (^2.7.2)
phpunit/phpunit suggests installing phpunit/php-invoker (^2.0.0)
phpunit/phpunit suggests installing ext-soap (*)
phpunit/phpunit suggests installing ext-xdebug (*)
monolog/monolog suggests installing graylog2/gelf-php (Allow sending log messages to a GrayLog2 server)
monolog/monolog suggests installing doctrine/couchdb (Allow sending log messages to a CouchDB server)
monolog/monolog suggests installing ruflin/elastica (Allow sending log messages to an Elastic Search server)
monolog/monolog suggests installing elasticsearch/elasticsearch (Allow sending log messages to an Elasticsearch server via official client)
monolog/monolog suggests installing php-amqplib/php-amqplib (Allow sending log messages to an AMQP server using php-amqplib)
monolog/monolog suggests installing ext-amqp (Allow sending log messages to an AMQP server (1.0+ required))
monolog/monolog suggests installing ext-mongodb (Allow sending log messages to a MongoDB server (via driver))
monolog/monolog suggests installing mongodb/mongodb (Allow sending log messages to a MongoDB server (via library))
monolog/monolog suggests installing aws/aws-sdk-php (Allow sending log messages to AWS services like DynamoDB)
monolog/monolog suggests installing rollbar/rollbar (Allow sending log messages to Rollbar)
monolog/monolog suggests installing php-console/php-console (Allow sending log messages to Google Chrome)
Package phpunit/php-token-stream is abandoned, you should avoid using it. No replacement was suggested.
Writing lock file
Generating optimized autoload files
38 packages you are using are looking for funding.
Use the `composer fund` command to find out more!

Upgrading n:eslint from 6.8.0 -> 7.16.0
Upgrading n:eslint-config-wikimedia from 0.15.3 -> 0.18.1
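The two upgrades above amount to a devDependencies bump roughly like the following (only the package names and new versions come from the log; the surrounding file contents and exact version pinning are assumptions):

```json
{
	"devDependencies": {
		"eslint": "7.16.0",
		"eslint-config-wikimedia": "0.18.1"
	}
}
```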
$ npm install

> dtrace-provider@0.8.8 install /src/repo/node_modules/dtrace-provider
> node-gyp rebuild || node suppress-error.js

make: Entering directory '/src/repo/node_modules/dtrace-provider/build'
  TOUCH Release/obj.target/DTraceProviderStub.stamp
make: Leaving directory '/src/repo/node_modules/dtrace-provider/build'

> heapdump@0.3.15 install /src/repo/node_modules/heapdump
> node-gyp rebuild

make: Entering directory '/src/repo/node_modules/heapdump/build'
  CXX(target) Release/obj.target/addon/src/heapdump.o
In file included from ../src/heapdump.cc:17:
../../nan/nan.h: In function ‘void Nan::AsyncQueueWorker(Nan::AsyncWorker*)’:
../../nan/nan.h:2294:62: warning: cast between incompatible function types from ‘void (*)(uv_work_t*)’ {aka ‘void (*)(uv_work_s*)’} to ‘uv_after_work_cb’ {aka ‘void (*)(uv_work_s*, int)’} [-Wcast-function-type]
     , reinterpret_cast<uv_after_work_cb>(AsyncExecuteComplete)
                                                              ^
In file included from ../src/heapdump.cc:15:
../src/heapdump.cc: At global scope:
/cache/node-gyp/10.21.0/include/node/node.h:573:43: warning: cast between incompatible function types from ‘void (*)(Nan::ADDON_REGISTER_FUNCTION_ARGS_TYPE)’ {aka ‘void (*)(v8::Local<v8::Object>)’} to ‘node::addon_register_func’ {aka ‘void (*)(v8::Local<v8::Object>, v8::Local<v8::Value>, void*)’} [-Wcast-function-type]
       (node::addon_register_func) (regfunc),                          \
                                           ^
/cache/node-gyp/10.21.0/include/node/node.h:607:3: note: in expansion of macro ‘NODE_MODULE_X’
   NODE_MODULE_X(modname, regfunc, NULL, 0)  // NOLINT (readability/null_usage)
   ^~~~~~~~~~~~~
../src/heapdump.cc:136:1: note: in expansion of macro ‘NODE_MODULE’
 NODE_MODULE(addon, Initialize)
 ^~~~~~~~~~~
In file included from /cache/node-gyp/10.21.0/include/node/node.h:63,
                 from ../src/heapdump.cc:15:
/cache/node-gyp/10.21.0/include/node/v8.h: In instantiation of ‘void v8::PersistentBase<T>::SetWeak(P*, typename v8::WeakCallbackInfo<P>::Callback, v8::WeakCallbackType) [with P = node::ObjectWrap; T = v8::Object; typename v8::WeakCallbackInfo<P>::Callback = void (*)(const v8::WeakCallbackInfo<node::ObjectWrap>&)]’:
/cache/node-gyp/10.21.0/include/node/node_object_wrap.h:84:78:   required from here
/cache/node-gyp/10.21.0/include/node/v8.h:9502:16: warning: cast between incompatible function types from ‘v8::WeakCallbackInfo<node::ObjectWrap>::Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<node::ObjectWrap>&)’} to ‘Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<void>&)’} [-Wcast-function-type]
                reinterpret_cast<Callback>(callback), type);
                ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/cache/node-gyp/10.21.0/include/node/v8.h: In instantiation of ‘void v8::PersistentBase<T>::SetWeak(P*, typename v8::WeakCallbackInfo<P>::Callback, v8::WeakCallbackType) [with P = Nan::ObjectWrap; T = v8::Object; typename v8::WeakCallbackInfo<P>::Callback = void (*)(const v8::WeakCallbackInfo<Nan::ObjectWrap>&)]’:
../../nan/nan_object_wrap.h:65:61:   required from here
/cache/node-gyp/10.21.0/include/node/v8.h:9502:16: warning: cast between incompatible function types from ‘v8::WeakCallbackInfo<Nan::ObjectWrap>::Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<Nan::ObjectWrap>&)’} to ‘Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<void>&)’} [-Wcast-function-type]
  SOLINK_MODULE(target) Release/obj.target/addon.node
  COPY Release/addon.node
make: Leaving directory '/src/repo/node_modules/heapdump/build'

> unix-dgram@2.0.4 install /src/repo/node_modules/unix-dgram
> node-gyp rebuild

make: Entering directory '/src/repo/node_modules/unix-dgram/build'
  CXX(target) Release/obj.target/unix_dgram/src/unix_dgram.o
In file included from ../src/unix_dgram.cc:5:
../../nan/nan.h: In function ‘void Nan::AsyncQueueWorker(Nan::AsyncWorker*)’:
../../nan/nan.h:2294:62: warning: cast between incompatible function types from ‘void (*)(uv_work_t*)’ {aka ‘void (*)(uv_work_s*)’} to ‘uv_after_work_cb’ {aka ‘void (*)(uv_work_s*, int)’} [-Wcast-function-type]
     , reinterpret_cast<uv_after_work_cb>(AsyncExecuteComplete)
                                                              ^
In file included from ../../nan/nan.h:56,
                 from ../src/unix_dgram.cc:5:
../src/unix_dgram.cc: At global scope:
/cache/node-gyp/10.21.0/include/node/node.h:573:43: warning: cast between incompatible function types from ‘void (*)(v8::Local<v8::Object>)’ to ‘node::addon_register_func’ {aka ‘void (*)(v8::Local<v8::Object>, v8::Local<v8::Value>, void*)’} [-Wcast-function-type]
       (node::addon_register_func) (regfunc),                          \
                                           ^
/cache/node-gyp/10.21.0/include/node/node.h:607:3: note: in expansion of macro ‘NODE_MODULE_X’
   NODE_MODULE_X(modname, regfunc, NULL, 0)  // NOLINT (readability/null_usage)
   ^~~~~~~~~~~~~
../src/unix_dgram.cc:404:1: note: in expansion of macro ‘NODE_MODULE’
 NODE_MODULE(unix_dgram, Initialize)
 ^~~~~~~~~~~
In file included from /cache/node-gyp/10.21.0/include/node/node.h:63,
                 from ../../nan/nan.h:56,
                 from ../src/unix_dgram.cc:5:
/cache/node-gyp/10.21.0/include/node/v8.h: In instantiation of ‘void v8::PersistentBase<T>::SetWeak(P*, typename v8::WeakCallbackInfo<P>::Callback, v8::WeakCallbackType) [with P = node::ObjectWrap; T = v8::Object; typename v8::WeakCallbackInfo<P>::Callback = void (*)(const v8::WeakCallbackInfo<node::ObjectWrap>&)]’:
/cache/node-gyp/10.21.0/include/node/node_object_wrap.h:84:78:   required from here
/cache/node-gyp/10.21.0/include/node/v8.h:9502:16: warning: cast between incompatible function types from ‘v8::WeakCallbackInfo<node::ObjectWrap>::Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<node::ObjectWrap>&)’} to ‘Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<void>&)’} [-Wcast-function-type]
                reinterpret_cast<Callback>(callback), type);
                ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/cache/node-gyp/10.21.0/include/node/v8.h: In instantiation of ‘void v8::PersistentBase<T>::SetWeak(P*, typename v8::WeakCallbackInfo<P>::Callback, v8::WeakCallbackType) [with P = Nan::ObjectWrap; T = v8::Object; typename v8::WeakCallbackInfo<P>::Callback = void (*)(const v8::WeakCallbackInfo<Nan::ObjectWrap>&)]’:
../../nan/nan_object_wrap.h:65:61:   required from here
/cache/node-gyp/10.21.0/include/node/v8.h:9502:16: warning: cast between incompatible function types from ‘v8::WeakCallbackInfo<Nan::ObjectWrap>::Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<Nan::ObjectWrap>&)’} to ‘Callback’ {aka ‘void (*)(const v8::WeakCallbackInfo<void>&)’} [-Wcast-function-type]
  SOLINK_MODULE(target) Release/obj.target/unix_dgram.node
  COPY Release/unix_dgram.node
make: Leaving directory '/src/repo/node_modules/unix-dgram/build'

> gc-stats@git+https://github.com/dainis/node-gcstats.git#5be60dfd24293d6cefbc8a459c1537611373fac5 install /src/repo/node_modules/gc-stats
> node-pre-gyp install --fallback-to-build

node-pre-gyp WARN Using request for node-pre-gyp https download 
[gc-stats] Success: "/src/repo/node_modules/gc-stats/build/gcstats/v1.5.0/Release/node-v64-linux-x64/gcstats.node" is installed via remote

> core-js@2.6.11 postinstall /src/repo/node_modules/core-js
> node -e "try{require('./postinstall')}catch(e){}"

Thank you for using core-js ( https://github.com/zloirock/core-js ) for polyfilling JavaScript standard library!

The project needs your help! Please consider supporting of core-js on Open Collective or Patreon: 
> https://opencollective.com/core-js 
> https://www.patreon.com/zloirock 

Also, the author of core-js ( https://github.com/zloirock ) is looking for a good job -)


> core-js@3.9.1 postinstall /src/repo/node_modules/eslint-plugin-compat/node_modules/core-js
> node -e "try{require('./postinstall')}catch(e){}"

npm WARN optional SKIPPING OPTIONAL DEPENDENCY: fsevents@2.1.3 (node_modules/fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for fsevents@2.1.3: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})

added 491 packages from 354 contributors and audited 493 packages in 183.243s

48 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities


$ npm update eslint -depth 10

$ ./node_modules/.bin/eslint . --fix

/src/repo/bin/benchmark.js
   64:22  error  'config' is already declared in the upper scope   no-shadow
   84:4   error  Don't use process.exit(); throw an error instead  no-process-exit
  136:8   error  'config' is already declared in the upper scope   no-shadow
  222:47  error  'config' is already declared in the upper scope   no-shadow
  233:16  error  'config' is already declared in the upper scope   no-shadow
  349:4   error  Don't use process.exit(); throw an error instead  no-process-exit

/src/repo/bin/domdiff.test.js
   7:23  error  "../lib/html2wt/DOMDiff.js" is not found          node/no-missing-require
  10:29  error  "../lib/logger/ParsoidLogger.js" is not found     node/no-missing-require
  88:2   error  Don't use process.exit(); throw an error instead  no-process-exit

/src/repo/bin/inspectTokenizer.js
  211:2  error  Don't use process.exit(); throw an error instead  no-process-exit

/src/repo/bin/langconv-test.js
   12:51  error    "../lib/mw/ApiRequest.js" is not found               node/no-missing-require
   17:41  error    "../lib/config/MWParserEnvironment.js" is not found  node/no-missing-require
   20:37  error    "../lib/mw/ApiRequest.js" is not found               node/no-missing-require
  208:1   warning  Missing JSDoc @return declaration                    jsdoc/require-returns
  211:0   warning  Missing JSDoc @param "env" type                      jsdoc/require-param-type
  212:0   warning  Missing JSDoc @param "document" type                 jsdoc/require-param-type
  223:1   warning  Missing JSDoc @return declaration                    jsdoc/require-returns
  228:0   warning  Missing JSDoc @param "env" type                      jsdoc/require-param-type
  229:0   warning  Missing JSDoc @param "document" type                 jsdoc/require-param-type
  246:32  error    'env' is already declared in the upper scope         no-shadow
  613:4   error    Don't use process.exit(); throw an error instead     no-process-exit

/src/repo/bin/normalize.test.js
  66:2  error  Don't use process.exit(); throw an error instead  no-process-exit

/src/repo/bin/roundtrip-test.js
  926:4  error  Don't use process.exit(); throw an error instead  no-process-exit

/src/repo/core-upgrade.js
  34:17  error  'msg' is already declared in the upper scope  no-shadow

/src/repo/lib/config/ParsoidConfig.js
  522:20  error  'url.parse' was deprecated since v11.0.0. Use 'url.URL' constructor instead  node/no-deprecated-api

/src/repo/lib/html2wt/DOMNormalizer.js
   53:1  warning  Missing JSDoc @return declaration                    jsdoc/require-returns
   56:0  warning  Missing JSDoc @param "a" type                        jsdoc/require-param-type
   57:0  warning  Missing JSDoc @param "b" type                        jsdoc/require-param-type
   58:0  warning  Duplicate @param "a"                                 jsdoc/check-param-names
   58:0  warning  Missing JSDoc @param "a" type                        jsdoc/require-param-type
   59:0  warning  Missing JSDoc @param "b" type                        jsdoc/require-param-type
   65:1  warning  Missing JSDoc @return declaration                    jsdoc/require-returns
   71:0  warning  Missing JSDoc @param "a" type                        jsdoc/require-param-type
   72:0  warning  Missing JSDoc @param "b" type                        jsdoc/require-param-type
   73:0  warning  Duplicate @param "a"                                 jsdoc/check-param-names
   73:0  warning  Missing JSDoc @param "a" type                        jsdoc/require-param-type
   74:0  warning  Missing JSDoc @param "b" type                        jsdoc/require-param-type
  145:0  warning  The type 'SerializerState' is undefined              jsdoc/no-undefined-types
  191:2  warning  Missing JSDoc @return declaration                    jsdoc/require-returns
  194:0  warning  Missing JSDoc @param "a" type                        jsdoc/require-param-type
  195:0  warning  Missing JSDoc @param "b" type                        jsdoc/require-param-type
  196:0  warning  Missing JSDoc @param "a" type                        jsdoc/require-param-type
  196:0  warning  Duplicate @param "a"                                 jsdoc/check-param-names
  197:0  warning  Missing JSDoc @param "b" type                        jsdoc/require-param-type
  237:2  warning  Missing JSDoc @return declaration                    jsdoc/require-returns
  240:0  warning  Missing JSDoc @param "a" type                        jsdoc/require-param-type
  241:0  warning  Missing JSDoc @param "b" type                        jsdoc/require-param-type
  413:7  error    'firstChild' is already declared in the upper scope  no-shadow
  467:0  warning  The type 'Node' is undefined                         jsdoc/no-undefined-types
  468:0  warning  The type 'Node' is undefined                         jsdoc/no-undefined-types

/src/repo/lib/html2wt/DiffUtils.js
   14:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
   15:0  warning  The type 'MWParserEnvironment' is undefined  jsdoc/no-undefined-types
   25:2  warning  Missing JSDoc @return declaration            jsdoc/require-returns
   28:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
   29:0  warning  The type 'MWParserEnvironment' is undefined  jsdoc/no-undefined-types
   54:2  warning  Missing JSDoc @return declaration            jsdoc/require-returns
   58:0  warning  Missing JSDoc @param "node" type             jsdoc/require-param-type
   91:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
   92:0  warning  The type 'MWParserEnvironment' is undefined  jsdoc/no-undefined-types
  117:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
  119:0  warning  The type 'Element' is undefined              jsdoc/no-undefined-types
  128:2  warning  Missing JSDoc @return declaration            jsdoc/require-returns
  131:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
  132:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types

/src/repo/lib/html2wt/WTSUtils.js
   22:2  warning  Found more than one @return declaration      jsdoc/require-returns
   22:2  warning  Found more than one @return declaration      jsdoc/require-returns-check
   27:0  warning  Missing JSDoc @param "node" type             jsdoc/require-param-type
   28:0  warning  Missing JSDoc @param "name" type             jsdoc/require-param-type
   29:0  warning  Missing JSDoc @param "curVal" type           jsdoc/require-param-type
   67:2  warning  Found more than one @return declaration      jsdoc/require-returns
   67:2  warning  Found more than one @return declaration      jsdoc/require-returns-check
   70:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
   85:2  warning  Missing JSDoc @return declaration            jsdoc/require-returns
   89:0  warning  Missing JSDoc @param "src" type              jsdoc/require-param-type
   90:0  warning  Missing JSDoc @param "node" type             jsdoc/require-param-type
   91:0  warning  Missing JSDoc @param "state" type            jsdoc/require-param-type
   92:0  warning  Missing JSDoc @param "dontEmit" type         jsdoc/require-param-type
  106:2  warning  Missing JSDoc @return declaration            jsdoc/require-returns
  110:0  warning  Missing JSDoc @param "src" type              jsdoc/require-param-type
  111:0  warning  Missing JSDoc @param "node" type             jsdoc/require-param-type
  112:0  warning  Missing JSDoc @param "state" type            jsdoc/require-param-type
  113:0  warning  Missing JSDoc @param "dontEmit" type         jsdoc/require-param-type
  127:2  warning  Missing JSDoc @return declaration            jsdoc/require-returns
  133:0  warning  Missing JSDoc @param "origNode" type         jsdoc/require-param-type
  134:0  warning  Missing JSDoc @param "before" type           jsdoc/require-param-type
  169:2  warning  Missing JSDoc @return declaration            jsdoc/require-returns
  172:0  warning  Missing JSDoc @param "node" type             jsdoc/require-param-type
  173:0  warning  Missing JSDoc @param "sepNode" type          jsdoc/require-param-type
  209:0  warning  The type 'MWParserEnvironment' is undefined  jsdoc/no-undefined-types
  210:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
  289:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types

/src/repo/lib/utils/ContentUtils.js
   23:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
   35:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
   49:0  warning  The type 'MWParserEnvironment' is undefined  jsdoc/no-undefined-types
   52:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types
  104:0  warning  The type 'Node' is undefined                 jsdoc/no-undefined-types

/src/repo/lib/utils/DOMDataUtils.js
  116:2  warning  Missing JSDoc @return declaration  jsdoc/require-returns
  119:0  warning  The type 'Node' is undefined       jsdoc/no-undefined-types
  143:0  warning  The type 'Node' is undefined       jsdoc/no-undefined-types
  183:2  warning  Missing JSDoc @return declaration  jsdoc/require-returns
  186:0  warning  Missing JSDoc @param "node" type   jsdoc/require-param-type
  187:0  warning  Missing JSDoc @param "type" type   jsdoc/require-param-type
  188:0  warning  Duplicate @param "node"            jsdoc/check-param-names
  188:0  warning  Missing JSDoc @param "node" type   jsdoc/require-param-type
  189:0  warning  Missing JSDoc @param "type" type   jsdoc/require-param-type
  207:0  warning  Missing JSDoc @param "node" type   jsdoc/require-param-type
  208:0  warning  Missing JSDoc @param "type" type   jsdoc/require-param-type
  232:0  warning  Missing JSDoc @param "node" type   jsdoc/require-param-type
  233:0  warning  Missing JSDoc @param "type" type   jsdoc/require-param-type
  234:0  warning  Duplicate @param "node"            jsdoc/check-param-names
  234:0  warning  Missing JSDoc @param "node" type   jsdoc/require-param-type
  235:0  warning  Missing JSDoc @param "type" type   jsdoc/require-param-type
  265:0  warning  Missing JSDoc @param "node" type   jsdoc/require-param-type
  266:0  warning  Missing JSDoc @param "env" type    jsdoc/require-param-type
  267:0  warning  Missing JSDoc @param "data" type   jsdoc/require-param-type
  294:0  warning  The type 'Document' is undefined   jsdoc/no-undefined-types
  309:0  warning  The type 'Document' is undefined   jsdoc/no-undefined-types
  327:0  warning  Missing JSDoc @param "doc" type    jsdoc/require-param-type
  328:0  warning  Missing JSDoc @param "pb" type     jsdoc/require-param-type
  384:0  warning  The type 'Node' is undefined       jsdoc/no-undefined-types

/src/repo/lib/utils/DOMTraverser.js
  30:0  warning  The type 'Node' is undefined                                                       jsdoc/no-undefined-types
  31:0  warning  The type 'MWParserEnvironment' is undefined                                        jsdoc/no-undefined-types
  34:0  warning  The type 'true' is undefined                                                       jsdoc/no-undefined-types
  34:0  warning  The type 'false' is undefined                                                      jsdoc/no-undefined-types
  34:0  warning  The type 'Node' is undefined                                                       jsdoc/no-undefined-types
  44:0  warning  The type 'traverserHandler' is undefined                                           jsdoc/no-undefined-types
  51:1  warning  Missing JSDoc @return declaration                                                  jsdoc/require-returns
  52:0  warning  Missing JSDoc @param "node" type                                                   jsdoc/require-param-type
  53:0  warning  Missing JSDoc @param "env" type                                                    jsdoc/require-param-type
  54:0  warning  Missing JSDoc @param "atTopLevel" type                                             jsdoc/require-param-type
  55:0  warning  Missing JSDoc @param "tplInfo" type                                                jsdoc/require-param-type
  80:1  warning  JSDoc @return declaration present but return expression not available in function  jsdoc/require-returns-check
  91:0  warning  The type 'Node' is undefined                                                       jsdoc/no-undefined-types
  92:0  warning  The type 'MWParserEnvironment' is undefined                                        jsdoc/no-undefined-types
  96:0  warning  The type 'true' is undefined                                                       jsdoc/no-undefined-types
  96:0  warning  The type 'Node' is undefined                                                       jsdoc/no-undefined-types

/src/repo/lib/utils/DOMUtils.js
   23:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
   42:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
   43:0  warning  Missing JSDoc @param "handler" type           jsdoc/require-param-type
   60:0  warning  Missing JSDoc @param "from" type              jsdoc/require-param-type
   61:0  warning  Missing JSDoc @param "to" type                jsdoc/require-param-type
   62:0  warning  Missing JSDoc @param "beforeNode" type        jsdoc/require-param-type
   79:0  warning  Missing JSDoc @param "from" type              jsdoc/require-param-type
   80:0  warning  Missing JSDoc @param "to" type                jsdoc/require-param-type
   81:0  warning  Missing JSDoc @param "beforeNode" type        jsdoc/require-param-type
   95:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
   99:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  105:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  109:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  115:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  119:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  125:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  129:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  147:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  155:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  156:0  warning  Missing JSDoc @param "nchildren" type         jsdoc/require-param-type
  157:0  warning  Missing JSDoc @param "countDiffMarkers" type  jsdoc/require-param-type
  174:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  175:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  176:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  191:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  192:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  201:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  202:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  204:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  216:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  220:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  221:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  230:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  234:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  235:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  244:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  247:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  261:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  275:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  278:0  warning  The type 'bool' is undefined                  jsdoc/no-undefined-types
  289:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  306:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  309:0  warning  The type 'bool' is undefined                  jsdoc/no-undefined-types
  351:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  353:0  warning  The type 'bool' is undefined                  jsdoc/no-undefined-types
  371:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  374:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  386:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  389:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  405:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  408:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  429:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  433:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  443:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  447:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  485:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  488:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  498:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  501:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  511:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  514:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  524:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  527:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  537:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  540:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  555:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  558:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  571:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  575:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  576:0  warning  Missing JSDoc @param "strict" type            jsdoc/require-param-type
  594:2  warning  Missing JSDoc @return declaration             jsdoc/require-returns
  598:0  warning  Missing JSDoc @param "node" type              jsdoc/require-param-type
  599:0  warning  Missing JSDoc @param "tagName" type           jsdoc/require-param-type
  618:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  628:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  629:0  warning  The type 'Node' is undefined                  jsdoc/no-undefined-types
  639:0  warning  The type 'Document' is undefined              jsdoc/no-undefined-types
  652:0  warning  The type 'Document' is undefined              jsdoc/no-undefined-types
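Most warnings in this section are of two kinds: `jsdoc/require-param-type` / `jsdoc/require-returns` (a `@param` or `@return` tag lacks a `{Type}`) and `jsdoc/no-undefined-types` (the `jsdoc` plugin doesn't know DOM globals like `Node`; that is usually resolved via plugin settings or declared globals). A minimal sketch of the fully annotated style these rules ask for — the function and its names are illustrative, not Parsoid's actual code:

```javascript
'use strict';

/**
 * Count the children of a node, optionally skipping diff markers.
 * (Hypothetical example; a plain object stands in for a DOM Node.)
 *
 * @param {Object} node Node-like object with a `children` array.
 * @param {boolean} countDiffMarkers Whether diff markers count.
 * @return {number} The number of children counted.
 */
function countChildren(node, countDiffMarkers) {
	return node.children.filter(
		(c) => countDiffMarkers || !c.isDiffMarker
	).length;
}

console.log(countChildren({ children: [ {}, { isDiffMarker: true } ] }, false));
```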

/src/repo/lib/utils/Diff.js
   16:1  warning  Missing JSDoc @return declaration       jsdoc/require-returns
   17:0  warning  Missing JSDoc @param "diff" type        jsdoc/require-param-type
   18:0  warning  Missing JSDoc @param "srcLengths" type  jsdoc/require-param-type
   19:0  warning  Missing JSDoc @param "outLengths" type  jsdoc/require-param-type
   20:0  warning  Missing JSDoc @param "diff" type        jsdoc/require-param-type
   20:0  warning  Duplicate @param "diff"                 jsdoc/check-param-names
   21:0  warning  Missing JSDoc @param "srcLengths" type  jsdoc/require-param-type
   22:0  warning  Missing JSDoc @param "outLengths" type  jsdoc/require-param-type
   23:0  warning  Missing JSDoc @param "diff" type        jsdoc/require-param-type
   24:0  warning  Missing JSDoc @param "srcLengths" type  jsdoc/require-param-type
   25:0  warning  Missing JSDoc @param "outLengths" type  jsdoc/require-param-type
  104:1  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  105:0  warning  Missing JSDoc @param "changes" type     jsdoc/require-param-type
  137:1  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  138:0  warning  Missing JSDoc @param "oldString" type   jsdoc/require-param-type
  139:0  warning  Missing JSDoc @param "newString" type   jsdoc/require-param-type
  161:1  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  162:0  warning  Missing JSDoc @param "oldString" type   jsdoc/require-param-type
  163:0  warning  Missing JSDoc @param "newString" type   jsdoc/require-param-type
  175:1  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  176:0  warning  Missing JSDoc @param "a" type           jsdoc/require-param-type
  177:0  warning  Missing JSDoc @param "b" type           jsdoc/require-param-type
  178:0  warning  Missing JSDoc @param "options" type     jsdoc/require-param-type
  237:1  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  241:0  warning  Missing JSDoc @param "diff" type        jsdoc/require-param-type
  320:1  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  321:0  warning  Missing JSDoc @param "a" type           jsdoc/require-param-type
  322:0  warning  Missing JSDoc @param "b" type           jsdoc/require-param-type

/src/repo/lib/utils/TokenUtils.js
  12:2  warning  Missing JSDoc @return declaration  jsdoc/require-returns
  18:0  warning  Missing JSDoc @param "name" type   jsdoc/require-param-type

/src/repo/lib/utils/Util.js
   59:2  warning  Missing JSDoc @return declaration       jsdoc/require-returns
   62:0  warning  Missing JSDoc @param "name" type        jsdoc/require-param-type
  129:2  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  134:0  warning  Missing JSDoc @param "str" type         jsdoc/require-param-type
  135:0  warning  Missing JSDoc @param "idx" type         jsdoc/require-param-type
  136:0  warning  Missing JSDoc @param "str" type         jsdoc/require-param-type
  136:0  warning  Duplicate @param "str"                  jsdoc/check-param-names
  137:0  warning  Missing JSDoc @param "idx" type         jsdoc/require-param-type
  160:2  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  167:0  warning  Missing JSDoc @param "str" type         jsdoc/require-param-type
  173:2  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  183:0  warning  Missing JSDoc @param "txt" type         jsdoc/require-param-type
  260:2  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  298:0  warning  The type 'Title' is undefined           jsdoc/no-undefined-types
  299:0  warning  The type 'Title' is undefined           jsdoc/no-undefined-types
  313:0  warning  The type 'Namespace' is undefined       jsdoc/no-undefined-types
  314:0  warning  The type 'Namespace' is undefined       jsdoc/no-undefined-types
  432:2  warning  Missing JSDoc @return declaration       jsdoc/require-returns
  436:0  warning  Missing JSDoc @param "linkTarget" type  jsdoc/require-param-type
  437:0  warning  Missing JSDoc @param "env" type         jsdoc/require-param-type

/src/repo/lib/utils/WTUtils.js
   28:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
   40:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
   43:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
   56:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
   61:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
   67:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
   72:0  warning  Missing JSDoc @param "aNode" type   jsdoc/require-param-type
   73:0  warning  Missing JSDoc @param "dp" type      jsdoc/require-param-type
   74:0  warning  Missing JSDoc @param "aNode" type   jsdoc/require-param-type
   74:0  warning  Duplicate @param "aNode"            jsdoc/check-param-names
   75:0  warning  Missing JSDoc @param "dp" type      jsdoc/require-param-type
  124:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  135:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  136:0  warning  The type 'bool' is undefined        jsdoc/no-undefined-types
  142:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  145:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  151:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  154:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  161:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  165:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  172:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  175:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  190:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  198:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  212:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  215:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  233:0  warning  The type 'TextNode' is undefined    jsdoc/no-undefined-types
  264:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  293:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  302:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  323:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  330:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  354:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  374:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  377:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  383:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  389:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  419:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  434:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  439:0  warning  The type 'Frame' is undefined       jsdoc/no-undefined-types
  440:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  449:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  463:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  464:0  warning  Missing JSDoc @param "about" type   jsdoc/require-param-type
  490:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  499:0  warning  Missing JSDoc @param "node" type    jsdoc/require-param-type
  603:0  warning  The type 'Node' is undefined        jsdoc/no-undefined-types
  623:2  warning  Missing JSDoc @return declaration   jsdoc/require-returns
  628:0  warning  Missing JSDoc @param "typeOf" type  jsdoc/require-param-type
  629:0  warning  Missing JSDoc @param "attrs" type   jsdoc/require-param-type
  630:0  warning  Missing JSDoc @param "encode" type  jsdoc/require-param-type

/src/repo/lib/utils/jsutils.js
   59:2  warning  Missing JSDoc @return declaration                                                  jsdoc/require-returns
   66:0  warning  Missing JSDoc @param "it" type                                                     jsdoc/require-param-type
   67:0  warning  Missing JSDoc @param "freezeEntries" type                                          jsdoc/require-param-type
   83:2  warning  Missing JSDoc @return declaration                                                  jsdoc/require-returns
   86:0  warning  Missing JSDoc @param "it" type                                                     jsdoc/require-param-type
   87:0  warning  Missing JSDoc @param "freezeEntries" type                                          jsdoc/require-param-type
   88:0  warning  Missing JSDoc @param "it" type                                                     jsdoc/require-param-type
   88:0  warning  Duplicate @param "it"                                                              jsdoc/check-param-names
   89:0  warning  Missing JSDoc @param "freezeEntries" type                                          jsdoc/require-param-type
  141:2  warning  JSDoc @return declaration present but return expression not available in function  jsdoc/require-returns-check
  160:2  warning  Missing JSDoc @return declaration                                                  jsdoc/require-returns
  164:0  warning  Missing JSDoc @param "obj" type                                                    jsdoc/require-param-type
  180:2  warning  Missing JSDoc @return declaration                                                  jsdoc/require-returns
  185:0  warning  Missing JSDoc @param "n" type                                                      jsdoc/require-param-type
  273:2  warning  Missing JSDoc @return declaration                                                  jsdoc/require-returns
  277:0  warning  Missing JSDoc @param "accum" type                                                  jsdoc/require-param-type
  278:0  warning  Missing JSDoc @param "arr" type                                                    jsdoc/require-param-type
  279:0  warning  Missing JSDoc @param "accum" type                                                  jsdoc/require-param-type
  279:0  warning  Duplicate @param "accum"                                                           jsdoc/check-param-names
  280:0  warning  Missing JSDoc @param "arr" type                                                    jsdoc/require-param-type
  294:2  warning  Found more than one @return declaration                                            jsdoc/require-returns
  294:2  warning  Found more than one @return declaration                                            jsdoc/require-returns-check
  368:0  warning  The type 'true' is undefined                                                       jsdoc/no-undefined-types

/src/repo/lib/wt2html/XMLSerializer.js
  241:1  warning  Missing JSDoc @return declaration              jsdoc/require-returns
  244:0  warning  The type 'Node' is undefined                   jsdoc/no-undefined-types
  260:9  error    'node' is already declared in the upper scope  no-shadow

/src/repo/tests/TestUtils.js
   171:0  warning  The type 'Node' is undefined                      jsdoc/no-undefined-types
   176:0  warning  The type 'Node' is undefined                      jsdoc/no-undefined-types
   283:1  warning  Missing JSDoc @return declaration                 jsdoc/require-returns
   286:0  warning  Missing JSDoc @param "html" type                  jsdoc/require-param-type
   999:3  error    Don't use process.exit(); throw an error instead  no-process-exit
  1015:5  error    Don't use process.exit(); throw an error instead  no-process-exit

/src/repo/tests/api-testing/Parsoid.js
  48:20  error  'url.parse' was deprecated since v11.0.0. Use 'url.URL' constructor instead  node/no-deprecated-api

/src/repo/tests/mockAPI.js
  566:13  error  'text' is already declared in the upper scope  no-shadow
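The `no-shadow` errors (here and in `XMLSerializer.js`, `build-langconv-fst.js`) all have the same shape of fix: rename the inner binding so it no longer hides the outer one. A hypothetical before/after, not the actual Parsoid code:

```javascript
'use strict';

// Illustrative no-shadow fix: the inner variable is renamed so it no
// longer hides the outer `text` parameter.
function normalize(text) {
	// Before: const text = text.trim();  // no-shadow (and a TDZ error)
	const trimmed = text.trim();
	return trimmed.toLowerCase();
}

console.log(normalize('  Hello '));  // 'hello'
```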

/src/repo/tools/ScriptUtils.js
   57:4  error    Don't use process.exit(); throw an error instead  no-process-exit
   94:2  warning  Missing JSDoc @return declaration                 jsdoc/require-returns
  133:2  warning  Missing JSDoc @return declaration                 jsdoc/require-returns
  171:2  warning  Missing JSDoc @return declaration                 jsdoc/require-returns
  276:2  warning  Missing JSDoc @return declaration                 jsdoc/require-returns
  285:0  warning  Missing JSDoc @param "opts" type                  jsdoc/require-param-type
  286:0  warning  Missing JSDoc @param "defaults" type              jsdoc/require-param-type

/src/repo/tools/build-langconv-fst.js
   14:0   warning  Invalid JSDoc tag name "0@"                   jsdoc/check-tag-names
   18:0   warning  Invalid JSDoc tag name "_IDENTITY_SYMBOL_@"   jsdoc/check-tag-names
   84:21  error    "../lib/language/FST.js" is not found         node/no-missing-require
  326:6   error    'fs' is already declared in the upper scope   no-shadow
  363:10  error    'out' is already declared in the upper scope  no-shadow

/src/repo/tools/compare.linter.results.js
  101:2  error  Don't use process.exit(); throw an error instead  no-process-exit
  108:2  error  Don't use process.exit(); throw an error instead  no-process-exit
  114:2  error  Don't use process.exit(); throw an error instead  no-process-exit
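The recurring `no-process-exit` errors (also in `TestUtils.js`, `ScriptUtils.js`, `sync-parserTests.js`) ask for a thrown error instead of an immediate exit, so the top-level caller decides the exit code. A sketch of the pattern, with hypothetical names:

```javascript
'use strict';

// Before: if (!opts.config) { console.error('missing'); process.exit(1); }
// After: throw, and let the entry point translate errors into exit codes.
function requireConfig(opts) {
	if (!opts.config) {
		throw new Error('Missing required --config option');
	}
	return opts.config;
}

let message = null;
try {
	requireConfig({});
} catch (e) {
	message = e.message;
}
console.log(message);  // 'Missing required --config option'
```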

/src/repo/tools/fetch-revision-data.js
  20:31  error  "../lib/mw/ApiRequest.js" is not found               node/no-missing-require
  22:35  error  "../lib/config/MWParserEnvironment.js" is not found  node/no-missing-require

/src/repo/tools/fetch-wt.js
  20:31  error  "../lib/mw/ApiRequest.js" is not found               node/no-missing-require
  22:35  error  "../lib/config/MWParserEnvironment.js" is not found  node/no-missing-require

/src/repo/tools/sync-baseconfig.js
  17:29  error  "../lib/mw/ApiRequest.js" is not found               node/no-missing-require
  18:35  error  "../lib/config/MWParserEnvironment.js" is not found  node/no-missing-require

/src/repo/tools/sync-parserTests.js
    8:0  warning  Expected JSDoc block to be aligned                jsdoc/check-alignment
  189:2  error    Don't use process.exit(); throw an error instead  no-process-exit

✖ 372 problems (39 errors, 333 warnings)
  0 errors and 1 warning potentially fixable with the `--fix` option.


$ ./node_modules/.bin/eslint . -f json
[{"filePath":"/src/repo/baseconfig/2/arwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/be-taraskwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/cawiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/cswiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/dewiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/enwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/eswiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecated
Rules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/fawiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/fiwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/frwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/iswiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/kaawiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/lnwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"
/src/repo/baseconfig/2/nlwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/srwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/trwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/2/zhwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/arwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/be-taraskwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/cawiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"n
o-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/cswiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/dewiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/enwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/eswiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/fawiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/fiwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/frwiki.json"
,"messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/iswiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/kaawiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/lnwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/nlwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/srwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/trwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId
":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/baseconfig/zhwiki.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/bin/benchmark.js","messages":[{"ruleId":"no-shadow","severity":2,"message":"'config' is already declared in the upper scope.","line":64,"column":22,"nodeType":"Identifier","messageId":"noShadow","endLine":64,"endColumn":28},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":84,"column":4,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":84,"endColumn":20},{"ruleId":"no-shadow","severity":2,"message":"'config' is already declared in the upper scope.","line":136,"column":8,"nodeType":"Identifier","messageId":"noShadow","endLine":136,"endColumn":14},{"ruleId":"no-shadow","severity":2,"message":"'config' is already declared in the upper scope.","line":222,"column":47,"nodeType":"Identifier","messageId":"noShadow","endLine":222,"endColumn":53},{"ruleId":"no-shadow","severity":2,"message":"'config' is already declared in the upper scope.","line":233,"column":16,"nodeType":"Identifier","messageId":"noShadow","endLine":233,"endColumn":22},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":349,"column":4,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":349,"endColumn":19}],"errorCount":6,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"'use strict';\n\nconst fs = require('fs');\nconst yaml = require('js-yaml');\nrequire('../core-upgrade.js');\nconst Promise = require('../lib/utils/promise.js');\nconst request = Promise.promisify(require('request'), true);\n\n// Some semi-arbitrary list of 
titles\nconst sampleTitles = [\n\t{ wiki: 'enwiki', title: 'Main_Page', revid: 917272779 },\n\t{ wiki: 'enwiki', title: 'Skating', revid: 921619251 },\n\t{ wiki: 'enwiki', title: 'Hospet', revid: 913341503 },\n\t{ wiki: 'enwiki', title: 'Hampi', revid: 921528573 },\n\t{ wiki: 'enwiki', title: 'Berlin', revid: 921687210 },\n\t{ wiki: 'enwiki', title: 'Barack_Obama', revid: 921752860 },\n\t{ wiki: 'enwiki', title: 'Max_Planck_Institute_for_Physics', revid: 921775647 },\n\t{ wiki: 'enwiki', title: 'Architects & Engineers for 9/11 Truth', revid: 921775875 },\n\t{ wiki: 'itwiki', title: 'Luna', revid: 108284424 },\n\t{ wiki: 'itwiki', title: 'Metro', revid: 108262882 },\n\t{ wiki: 'frwiki', title: 'Mulholland_Drive', revid: 149562710 },\n\t{ wiki: 'frwiki', title: 'Metro', revid: 108262882 },\n\t{ wiki: 'frwiki', title: 'François_de_La_Tour_du_Pin', revid: 163623032 },\n\t{ wiki: 'frwiki', title: 'Jason_Bateman', revid: 163623075 },\n\t{ wiki: 'jawiki', title: '人類学', revid: 74657621 },\n\t{ wiki: 'jawiki', title: 'パレオ・インディアン', revid: 70817191 },\n\t{ wiki: 'mediawiki', title: 'Parsoid', revid: 3453996 },\n\t{ wiki: 'mediawiki', title: 'RESTBase', revid: 2962542 },\n\t{ wiki: 'mediawiki', title: 'VisualEditor', revid: 3408339 },\n\t{ wiki: 'dewikivoyage', title: 'Bengaluru', revid: 1224432 },\n\t{ wiki: 'dewikivoyage', title: 'Kopenhagen', revid: 1240570 },\n\t{ wiki: 'dewikivoyage', title: 'Stuttgart', revid: 1226146 },\n\t{ wiki: 'hiwiktionary', title: 'परिवर्णी', revid: 467616 },\n\t{ wiki: 'hiwiktionary', title: 'चीन', revid: 456648 },\n\t{ wiki: 'knwikisource', title: 'ಪಂಪಭಾರತ_ಪ್ರಥಮಾಶ್ವಾಸಂ', revid: 170413 },\n];\n\nlet config = {\n\t// File with \\n-separated json blobs with at least (wiki, title, oldId / revid) properties\n\t// If domain is provided, it is used, if not wiki is treated as a prefix\n\t// All other properties are ignored.\n\t// If this property is null, sampleTitles above is used\n\ttestTitles: null, // '/tmp/logs',\n\tmode: 'wt2html',\n\tjsServer: 
{\n\t\tbaseURI: 'http://localhost:8142',\n\t\tproxy: '',\n\t},\n\tphpServer: {\n\t\tbaseURI: 'http://DOMAIN/w/rest.php',\n\t\tproxy: '', // 'http://scandium.eqiad.wmnet:80',\n\t},\n\tmaxOutstanding: 4,\n\tmaxRequests: 25,\n\tverbose: true\n};\n\nconst state = {\n\ttimes: [],\n\tnumPendingRequests: 0,\n\toutStanding: 0\n};\n\nfunction genFullUrls(config, domain, title, revid) {\n\tlet initRestFragment, restFragment;\n\n\tswitch (config.mode || 'wt2html') {\n\t\tcase 'wt2html':\n\t\t\trestFragment = `${domain}/v3/page/html/${encodeURIComponent(title)}/${revid}`;\n\t\t\tbreak;\n\t\tcase 'wt2pb':\n\t\t\trestFragment = `${domain}/v3/page/pagebundle/${encodeURIComponent(title)}/${revid}`;\n\t\t\tbreak;\n\t\tcase 'html2wt':\n\t\t\tinitRestFragment = `${domain}/v3/page/html/${encodeURIComponent(title)}/${revid}`;\n\t\t\trestFragment = `${domain}/v3/transform/html/to/wikitext/${encodeURIComponent(title)}/${revid}`;\n\t\t\tbreak;\n\t\tcase 'pb2wt':\n\t\t\tinitRestFragment = `${domain}/v3/page/pagebundle/${encodeURIComponent(title)}/${revid}`;\n\t\t\trestFragment = `${domain}/v3/transform/pagebundle/to/wikitext/${encodeURIComponent(title)}/${revid}`;\n\t\t\tbreak;\n\t\tdefault:\n\t\t\tconsole.log(\"Mode \" + config.mode + \" is not supported right now.\");\n\t\t\tprocess.exit(-1);\n\t}\n\treturn {\n\t\tjs : `${config.jsServer.baseURI}/${restFragment}`,\n\t\tphp: `${config.phpServer.baseURI.replace(/DOMAIN/, domain)}/${restFragment}`,\n\t\tinit: initRestFragment ? 
`${config.phpServer.baseURI.replace(/DOMAIN/, domain)}/${initRestFragment}` : null,\n\t\tjsTime: null,\n\t\tphpTime: null,\n\t};\n}\n\nfunction prefixToDomain(prefix) {\n\tif (prefix === 'commonswiki') {\n\t\treturn 'commons.wikimedia.org';\n\t}\n\n\tif (prefix === 'metawiki') {\n\t\treturn 'meta.wikimedia.org';\n\t}\n\n\tif (prefix === 'wikidatawiki') {\n\t\treturn 'wikidata.org';\n\t}\n\n\tif (prefix === 'mediawiki' || prefix === 'mediawikiwiki') {\n\t\treturn 'www.mediawiki.org';\n\t}\n\n\tif (/wiki$/.test(prefix)) {\n\t\treturn prefix.replace(/wiki$/, '.wikipedia.org');\n\t}\n\n\tconst project = [ 'wiktionary', 'wikisource', 'wikivoyage', 'wikibooks', 'wikiquote', 'wikinews', 'wikiversity' ].find(function(p) {\n\t\treturn prefix.endsWith(p);\n\t});\n\n\treturn project ? `${prefix.substr(0, prefix.length - project.length)}.${project}.org` : null;\n}\n\nfunction contentFileName(url) {\n\t// Hacky\n\tconst suffix = /.*v3\\/(page|transform)\\/pagebundle/.test(url) ? 'pb.json' : 'html';\n\tconst wiki = url.replace(/\\/v3\\/.*/, '').replace(/.*\\//, '');\n\treturn '/tmp/' + wiki + \".\" + url.replace(/.*\\//, '') + \".php.\" + suffix;\n}\n\nfunction fetchPageContent(url) {\n\tconst fileName = contentFileName(url);\n\treturn fs.existsSync(fileName) ? fs.readFileSync(fileName, 'utf8') : null;\n}\n\nfunction issueRequest(opts, url, finalizer) {\n\tconst config = opts.config;\n\tconst fromWT = opts.mode === 'wt2html' || opts.mode === 'wt2pb';\n\tconst httpOptions = {\n\t\tmethod: fromWT ? 'GET' : 'POST',\n\t\theaders: { 'User-Agent': 'Parsoid-Test' },\n\t\tproxy: opts.proxy,\n\t\turi: fromWT ? url : url.replace(/\\/\\d+$/, ''), // strip oldid to suppress selser\n\t};\n\n\tif (!fromWT) {\n\t\thttpOptions.headers['Content-Type'] = 'application/json';\n\t\tconst content = fetchPageContent(url);\n\t\tif (!content) {\n\t\t\tconsole.log(\"Aborting request! 
Content not found @ \" + contentFileName(url));\n\t\t\t// Abort\n\t\t\tstate.numPendingRequests--;\n\t\t\tif (state.numPendingRequests === 0 && state.outStanding === 0) {\n\t\t\t\tconsole.log('resolving after abort');\n\t\t\t\tfinalizer();\n\t\t\t}\n\t\t\treturn;\n\t\t}\n\n\t\tif (opts.mode === 'pb2wt') {\n\t\t\tconst pb = JSON.parse(content);\n\t\t\thttpOptions.body = {\n\t\t\t\thtml: pb.html.body,\n\t\t\t\toriginal : {\n\t\t\t\t\t'data-parsoid': pb['data-parsoid']\n\t\t\t\t\t// non-selser mode, so don't need wikitext\n\t\t\t\t},\n\t\t\t};\n\t\t} else  {\n\t\t\thttpOptions.body = {\n\t\t\t\t'html': content\n\t\t\t};\n\t\t}\n\t\thttpOptions.body = JSON.stringify(httpOptions.body);\n\t}\n\n\tconst reqId = state.numPendingRequests;\n\tif (config.verbose) {\n\t\tconsole.log(`--> ID=${reqId}; URL:${url}; PENDING=${state.numPendingRequests}; OUTSTANDING=${state.outStanding}`);\n\t}\n\tstate.numPendingRequests--;\n\tstate.outStanding++;\n\tconst startTime = process.hrtime();\n\treturn request(httpOptions)\n\t.catch(function(error) { console.log(\"errrorr!\" + error); })\n\t.then(function(ret) {\n\t\tstate.outStanding--;\n\t\tif (opts.type === 'init') {\n\t\t\tfs.writeFileSync(contentFileName(url), ret[1]);\n\t\t\tif (state.numPendingRequests === 0 && state.outStanding === 0) {\n\t\t\t\tfinalizer();\n\t\t\t}\n\t\t} else {\n\t\t\tconst endTime = process.hrtime();\n\t\t\tconst reqTime = Math.round((endTime[0] * 1e9 + endTime[1]) / 1e6 - (startTime[0] * 1e9 + startTime[1]) / 1e6);\n\t\t\tif (config.verbose) {\n\t\t\t\tconsole.log(`<-- ID=${reqId}; URL:${url}; TIME=${reqTime}; STATUS: ${ret[0].statusCode}; LEN: ${ret[1].length}`);\n\t\t\t}\n\t\t\tif (!opts.results[reqId]) {\n\t\t\t\topts.results[reqId] = {\n\t\t\t\t\turl: url,\n\t\t\t\t};\n\t\t\t}\n\t\t\topts.results[reqId][opts.type + 'Time'] = reqTime;\n\t\t\tstate.times.push(reqTime);\n\t\t\tif (state.numPendingRequests === 0 && state.outStanding === 0) {\n\t\t\t\tconst res = state.times.reduce((stats, n) => 
{\n\t\t\t\t\tstats.sum += n;\n\t\t\t\t\tstats.min = n < stats.min ? n : stats.min;\n\t\t\t\t\tstats.max = n > stats.max ? n : stats.max;\n\t\t\t\t\treturn stats;\n\t\t\t\t}, { sum: 0, min: 1000000, max: 0 });\n\t\t\t\tres.avg = res.sum / state.times.length;\n\t\t\t\tres.median = state.times.sort((a, b) => a - b)[Math.floor(state.times.length / 2)];\n\t\t\t\tconsole.log(`\\n${opts.type.toUpperCase()} STATS: ${JSON.stringify(res)}`);\n\t\t\t\tfinalizer();\n\t\t\t}\n\t\t}\n\t})\n\t.catch(function(error) { console.log(\"errrorr!\" + error); });\n}\n\nfunction computeRandomRequestStream(testUrls, config) {\n\tconst numReqs = config.maxRequests;\n\tconst reqs = [];\n\tconst n = testUrls.length;\n\tfor (let i = 0; i < numReqs; i++) {\n\t\t// Pick a random url\n\t\treqs.push(testUrls[Math.floor(Math.random() * n)]);\n\t}\n\treturn reqs;\n}\n\nfunction reset(config) {\n\tstate.times = [];\n\tstate.numPendingRequests = config.maxRequests;\n\tstate.outStanding = 0; // # outstanding reqs\n}\n\nfunction runTests(opts, finalizer) {\n\tif (state.numPendingRequests > 0) {\n\t\tif (state.outStanding < opts.config.maxOutstanding) {\n\t\t\tconst url = opts.reqs[opts.reqs.length - state.numPendingRequests][opts.type];\n\t\t\tif (opts.type === 'js') {\n\t\t\t\topts.proxy = config.jsServer.proxy || '';\n\t\t\t} else { // 'php' or 'init' For init, content is always fetched from Parsoid/PHP\n\t\t\t\topts.proxy = config.phpServer.proxy || '';\n\t\t\t}\n\t\t\tif (opts.type === 'init' && fs.existsSync(contentFileName(url))) {\n\t\t\t\t// Content exists. 
Don't fetch.\n\t\t\t\tstate.numPendingRequests--;\n\t\t\t\tif (state.numPendingRequests === 0 && state.outStanding === 0) {\n\t\t\t\t\tfinalizer();\n\t\t\t\t\treturn;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tissueRequest(opts, url, finalizer);\n\t\t\t}\n\t\t}\n\t\tsetImmediate(() => runTests(opts, finalizer));\n\t}\n}\n\n// Override default config\nif (process.argv.length > 2) {\n\tconfig = yaml.load(fs.readFileSync(process.argv[2], 'utf8'));\n}\n\n// CLI overrides config\nif (process.argv.length > 3) {\n\tconfig.maxOutstanding = parseInt(process.argv[3], 10);\n}\n\n// CLI overrides config\nif (process.argv.length > 4) {\n\tconfig.maxRequests = parseInt(process.argv[4], 10);\n}\n\nlet testUrls;\nif (config.testTitles) {\n\t// Parse production logs and generate test urls\n\tconst logs = fs.readFileSync(config.testTitles, 'utf8');\n\tconst lines = logs.split(/\\n/);\n\ttestUrls = [];\n\tlines.forEach(function(l) {\n\t\tif (l) {\n\t\t\tconst log = JSON.parse(l);\n\t\t\tconst domain = log.domain || prefixToDomain(log.wiki);\n\t\t\tif (domain) {\n\t\t\t\ttestUrls.push(genFullUrls(config, domain, log.title, log.oldId || log.revid));\n\t\t\t}\n\t\t}\n\t});\n} else {\n\ttestUrls = [];\n\tsampleTitles.forEach(function(t) {\n\t\ttestUrls.push(genFullUrls(config, t.domain || prefixToDomain(t.wiki), t.title, t.revid));\n\t});\n}\n\nconst reqStream = computeRandomRequestStream(testUrls, config);\nconst opts = {\n\tconfig: config,\n\treqs: reqStream,\n\tresults: [],\n};\n\nlet p;\nif (/2wt$/.test(config.mode)) {\n\t// Fetch pb / html as necessary and save to disk\n\t// so we can run and benchmark pb2wt or html2wt after\n\tp = new Promise(function(resolve, reject) {\n\t\topts.type = 'init';\n\t\topts.mode = config.mode === 'pb2wt' ? 
'wt2pb' : 'wt2html';\n\t\tconsole.log(\"--- Initialization ---\");\n\t\treset(config);\n\t\trunTests(opts, function() {\n\t\t\tconsole.log(\"--- Initialization done---\");\n\t\t\tresolve();\n\t\t});\n\t});\n} else {\n\tp = Promise.resolve();\n}\n\np.then(function() {\n\treset(config);\n\topts.type = 'js';\n\topts.mode = config.mode;\n\tconsole.log(\"\\n\\n--- JS tests ---\");\n\trunTests(opts, function() {\n\t\tconsole.log(\"\\n\\n--- PHP tests---\");\n\t\treset(config);\n\t\topts.type = 'php';\n\t\topts.mode = config.mode;\n\t\trunTests(opts, function() {\n\t\t\tconsole.log(\"\\n--- All done---\\n\");\n\t\t\tlet numJSFaster = 0;\n\t\t\tlet numPHPFaster = 0;\n\t\t\topts.results.forEach(function(r) {\n\t\t\t\tif (r.jsTime < r.phpTime) {\n\t\t\t\t\tnumJSFaster++;\n\t\t\t\t\tconsole.log(`For ${r.url}, Parsoid/JS was faster than Parsoid/PHP (${r.jsTime} vs. ${r.phpTime})`);\n\t\t\t\t} else {\n\t\t\t\t\tnumPHPFaster++;\n\t\t\t\t}\n\t\t\t});\n\t\t\tconsole.log('\\n# of reqs where Parsoid/JS was faster than Parsoid/PHP: ' + numJSFaster);\n\t\t\tconsole.log('# of reqs where Parsoid/PHP was faster than Parsoid/JS: ' + numPHPFaster);\n\t\t\tprocess.exit(0);\n\t\t});\n\t});\n}).done();\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/bin/diff.html.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/bin/domdiff.test.js","messages":[{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/html2wt/DOMDiff.js\" is not found.","line":7,"column":23,"nodeType":"Literal","endLine":7,"endColumn":50},{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/logger/ParsoidLogger.js\" is not 
found.","line":10,"column":29,"nodeType":"Literal","endLine":10,"endColumn":61},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":88,"column":2,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":88,"endColumn":17}],"errorCount":3,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\nvar DOMDiff = require('../lib/html2wt/DOMDiff.js').DOMDiff;\nvar ScriptUtils = require('../tools/ScriptUtils.js').ScriptUtils;\nvar ContentUtils = require('../lib/utils/ContentUtils.js').ContentUtils;\nvar ParsoidLogger = require('../lib/logger/ParsoidLogger.js').ParsoidLogger;\nvar MockEnv = require('../tests/MockEnv.js').MockEnv;\nvar Promise = require('../lib/utils/promise.js');\nvar yargs = require('yargs');\nvar fs = require('pn/fs');\n\nvar opts = yargs\n.usage(\"Usage: $0 [options] [old-html-file new-html-file]\\n\\nProvide either inline html OR 2 files\")\n.options({\n\thelp: {\n\t\tdescription: 'Show this message',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\toldhtml: {\n\t\tdescription: 'Old html',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\tnewhtml: {\n\t\tdescription: 'New html',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\tquiet: {\n\t\tdescription: 'Emit only the marked-up HTML',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\tdebug: {\n\t\tdescription: 'Debug mode',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n});\n\nPromise.async(function *() {\n\tvar argv = opts.argv;\n\tvar oldhtml = argv.oldhtml;\n\tvar newhtml = argv.newhtml;\n\n\tif (!oldhtml && argv._[0]) {\n\t\toldhtml = yield fs.readFile(argv._[0], 'utf8');\n\t\tnewhtml = yield fs.readFile(argv._[1], 'utf8');\n\t}\n\n\tif (ScriptUtils.booleanOption(argv.help) || !oldhtml || !newhtml) {\n\t\topts.showHelp();\n\t\treturn;\n\t}\n\n\tconst dummyEnv = new MockEnv({\n\t\tdebug: 
ScriptUtils.booleanOption(argv.debug),\n\t}, null);\n\n\t// FIXME: Move to `MockEnv`\n\tif (argv.debug) {\n\t\tvar logger = new ParsoidLogger(dummyEnv);\n\t\tlogger.registerBackend(/^(trace|debug)(\\/|$)/, logger.getDefaultTracerBackend());\n\t\tdummyEnv.log = (...args) => logger.log(...args);\n\t} else {\n\t\tdummyEnv.log = function() {};\n\t}\n\n\tvar oldDOM = ContentUtils.ppToDOM(dummyEnv, oldhtml, { markNew: true });\n\tvar newDOM = ContentUtils.ppToDOM(dummyEnv, newhtml, { markNew: true });\n\n\tContentUtils.stripSectionTagsAndFallbackIds(oldDOM);\n\tContentUtils.stripSectionTagsAndFallbackIds(newDOM);\n\n\t(new DOMDiff(dummyEnv)).diff(oldDOM, newDOM);\n\n\tContentUtils.dumpDOM(newDOM, 'DIFF-marked DOM', {\n\t\tquiet: !!ScriptUtils.booleanOption(argv.quiet),\n\t\tstoreDiffMark: true,\n\t\tenv: dummyEnv,\n\t});\n\n\tprocess.exit(0);\n})().done();\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/bin/inspectTokenizer.js","messages":[{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":211,"column":2,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":211,"endColumn":17}],"errorCount":1,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nvar yargs = require('yargs');\nvar PegTokenizer = require('../lib/wt2html/tokenizer.js').PegTokenizer;\nvar fs = require('fs');\n\nyargs.usage('Inspect the PEG.js grammar and generated source.');\n\n//\t'Inspect the PEG.js grammar and generated source');\n\nyargs.options({\n\t'source': {\n\t\tdescription: 'Show tokenizer source code',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\n\t'rules': {\n\t\tdescription: 'Show rule action source code',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\n\t'callgraph': {\n\t\tdescription: 'Write out a DOT graph of 
rule dependencies',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\n\t'list-orphans': {\n\t\tdescription: 'List rules that are not called by any other rule',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\n\t'outfile': {\n\t\tdescription: 'File name to write the output to',\n\t\t'boolean': false,\n\t\t'default': '-',\n\t\t'alias': 'o'\n\t},\n\n\t'php': {\n\t\tdescription: 'Use the PHP grammar',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\n\t'trace': {\n\t\tdescription: 'Generate code that logs rule transitions to stdout',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n});\n\nyargs.help();\n\nfunction getOutputStream(opts) {\n\tif (!opts.outfile || opts.outfile === '-') {\n\t\treturn process.stdout;\n\t} else {\n\t\treturn fs.createWriteStream(opts.outfile);\n\t}\n}\n\nfunction generateSource(opts) {\n\tvar file = getOutputStream(opts);\n\tvar tokenizer = new PegTokenizer();\n\tvar pegOpts = { trace: opts.trace };\n\tvar source = tokenizer.compileTokenizer(tokenizer.parseTokenizer(pegOpts), pegOpts);\n\tfile.write(source, 'utf8');\n}\n\nfunction generateRules(opts) {\n\tvar file = getOutputStream(opts);\n\tvar tokenizer = new PegTokenizer();\n\tvar pegOpts = { php: opts.php };\n\tvar ast = tokenizer.parseTokenizer(pegOpts);\n\tvar visitor = require('wikipeg/lib/compiler/visitor');\n\n\t// Current code style seems to use spaces in the tokenizer.\n\tvar tab = '    ';\n\t// Add some eslint overrides and define globals.\n\tvar rulesSource = '/* eslint-disable indent,camelcase,no-unused-vars */\\n';\n\trulesSource += \"\\n'use strict';\\n\\n\";\n\trulesSource += 'var options, location, input, text, peg$cache, peg$currPos, peg$savedPos;\\n';\n\t// Prevent redefinitions of variables involved in choice expressions\n\tvar seen = new Set();\n\tvar addVar = function(name) {\n\t\tif (!seen.has(name)) {\n\t\t\trulesSource += tab + 'var ' + name + ' = null;\\n';\n\t\t\tseen.add(name);\n\t\t}\n\t};\n\t// Collect all the code blocks in the AST.\n\tvar 
dumpCode = function(node) {\n\t\tif (node.code) {\n\t\t\t// remove trailing whitespace for single-line predicates\n\t\t\tvar code = node.code.replace(/[ \\t]+$/, '');\n\t\t\t// wrap with a function, to prevent spurious errors caused\n\t\t\t// by redeclarations or multiple returns in a block.\n\t\t\trulesSource += tab + '(function() {\\n' + code + '\\n' +\n\t\t\t\ttab + '})();\\n';\n\t\t}\n\t};\n\tvar visit = visitor.build({\n\t\tinitializer: function(node) {\n\t\t\tif (node.code) {\n\t\t\t\trulesSource += node.code + '\\n';\n\t\t\t}\n\t\t},\n\t\tsemantic_and: dumpCode,\n\t\tsemantic_node: dumpCode,\n\t\trule: function(node) {\n\t\t\trulesSource += 'function rule_' + node.name + '() {\\n';\n\t\t\tseen.clear();\n\t\t\tvisit(node.expression);\n\t\t\trulesSource += '}\\n';\n\t\t},\n\t\tlabeled: function(node) {\n\t\t\taddVar(node.label);\n\t\t\tvisit(node.expression);\n\t\t},\n\t\tlabeled_param: function(node) {\n\t\t\taddVar(node.label);\n\t\t},\n\t\tnamed: function(node) {\n\t\t\taddVar(node.name);\n\t\t\tvisit(node.expression);\n\t\t},\n\t\taction: function(node) {\n\t\t\tvisit(node.expression);\n\t\t\tdumpCode(node);\n\t\t},\n\t});\n\tvisit(ast);\n\t// Write rules to file.\n\tfile.write(rulesSource, 'utf8');\n}\n\nfunction generateCallgraph(opts) {\n\tvar file = getOutputStream(opts);\n\tvar tokenizer = new PegTokenizer();\n\tvar pegOpts = { php: opts.php };\n\tvar ast = tokenizer.parseTokenizer(pegOpts);\n\tvar visitor = require('wikipeg/lib/compiler/visitor');\n\tvar edges = [];\n\tvar currentRuleName;\n\n\tvar visit = visitor.build({\n\t\trule: function(node) {\n\t\t\tcurrentRuleName = node.name;\n\t\t\tvisit(node.expression);\n\t\t},\n\n\t\trule_ref: function(node) {\n\t\t\tvar edge = \"\\t\" + currentRuleName + \" -> \" + node.name + \";\";\n\t\t\tif (edges.indexOf(edge) === -1) {\n\t\t\t\tedges.push(edge);\n\t\t\t}\n\t\t}\n\t});\n\n\tvisit(ast);\n\n\tvar dot = \"digraph {\\n\" +\n\t\tedges.join(\"\\n\") + \"\\n\" +\n\t\t\"}\\n\";\n\n\tfile.write(dot, 
'utf8');\n}\n\nfunction listOrphans(opts) {\n\tvar file = getOutputStream(opts);\n\tvar tokenizer = new PegTokenizer();\n\tvar pegOpts = { php: opts.php };\n\tvar ast = tokenizer.parseTokenizer(pegOpts);\n\tvar visitor = require('wikipeg/lib/compiler/visitor');\n\n\tvar rules = {};\n\n\tvisitor.build({\n\t\trule: function(node) {\n\t\t\trules[node.name] = true;\n\t\t},\n\t})(ast);\n\n\tvisitor.build({\n\t\trule_ref: function(node) {\n\t\t\tdelete rules[node.name];\n\t\t},\n\t})(ast);\n\n\tfile.write(Object.getOwnPropertyNames(rules).join('\\n') + '\\n');\n}\n\nvar opts = yargs.argv;\n\nif (opts.source) {\n\tgenerateSource(opts);\n} else if (opts.rules) {\n\tgenerateRules(opts);\n} else if (opts.callgraph) {\n\tgenerateCallgraph(opts);\n} else if (opts['list-orphans']) {\n\tlistOrphans(opts);\n} else {\n\tconsole.error(\"Either --source, --rules, --callgraph or --list-orphans must be specified\");\n\tprocess.exit(1);\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/bin/langconv-test.js","messages":[{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/mw/ApiRequest.js\" is not found.","line":12,"column":51,"nodeType":"Literal","endLine":12,"endColumn":76},{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/config/MWParserEnvironment.js\" is not found.","line":17,"column":41,"nodeType":"Literal","endLine":17,"endColumn":79},{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/mw/ApiRequest.js\" is not found.","line":20,"column":37,"nodeType":"Literal","endLine":20,"endColumn":62},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":208,"column":1,"nodeType":"Block","endLine":213,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"env\" 
type.","line":211,"column":null,"nodeType":"Block","endLine":211,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"document\" type.","line":212,"column":null,"nodeType":"Block","endLine":212,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":223,"column":1,"nodeType":"Block","endLine":230,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"env\" type.","line":228,"column":null,"nodeType":"Block","endLine":228,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"document\" type.","line":229,"column":null,"nodeType":"Block","endLine":229,"endColumn":null},{"ruleId":"no-shadow","severity":2,"message":"'env' is already declared in the upper scope.","line":246,"column":32,"nodeType":"Identifier","messageId":"noShadow","endLine":246,"endColumn":35},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":613,"column":4,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":613,"endColumn":26}],"errorCount":5,"warningCount":6,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\nconst colors = require('colors');\nconst fs = require('pn/fs');\nconst path = require('path');\nconst yargs = require('yargs');\n\nconst { ApiRequest, DoesNotExistError } = require('../lib/mw/ApiRequest.js');\nconst { Diff } = require('../lib/utils/Diff.js');\nconst { DOMDataUtils } = require('../lib/utils/DOMDataUtils.js');\nconst { DOMTraverser } = require('../lib/utils/DOMTraverser.js');\nconst { DOMUtils } = require('../lib/utils/DOMUtils.js');\nconst { MWParserEnvironment } = require('../lib/config/MWParserEnvironment.js');\nconst { ParsoidConfig } = require('../lib/config/ParsoidConfig.js');\nconst Promise = 
require('../lib/utils/promise.js');\nconst { TemplateRequest } = require('../lib/mw/ApiRequest.js');\nconst { Util } = require('../lib/utils/Util.js');\nconst { ScriptUtils } = require('../tools/ScriptUtils.js');\n\nconst jsonFormat = function(error, domain, title, lang, options, results) {\n\tif (error) { return { error: error.stack || error.toString() }; }\n\tconst p = Diff.patchDiff(results.php, results.parsoid);\n\treturn { patch: p };\n};\n\nconst plainFormat = function(error, domain, title, lang, options, results) {\n\tif (error) { return error.stack || error.toString(); }\n\tconst article = `${domain} ${title} ${lang || ''}`;\n\tconst diff = Diff.colorDiff(results.php, results.parsoid, {\n\t\tcontext: 1,\n\t\tnoColor: (colors.mode === 'none'),\n\t\tdiffCount: true,\n\t});\n\tif (diff.count === 0) { return ''; }\n\treturn `== ${article} ==\\n${diff.output}\\n${diff.count} different words found.\\n`;\n};\n\nconst xmlFormat = function(error, domain, title, lang, options, results) {\n\tconst article = `${domain} ${title} ${lang || ''}`;\n\tlet output = '<testsuites>\\n';\n\toutput += `<testsuite name=\"Variant ${Util.escapeHtml(article)}\">\\n`;\n\toutput += `<testcase name=\"revision ${results.revid}\">\\n`;\n\tif (error) {\n\t\toutput += '<error type=\"parserFailedToFinish\">';\n\t\toutput += Util.escapeHtml(error.stack || error.toString());\n\t\toutput += '</error>';\n\t} else if (results.php !== results.parsoid) {\n\t\toutput += '<failure type=\"diff\">\\n<diff class=\"html\">\\n';\n\t\toutput += Diff.colorDiff(results.php, results.parsoid, {\n\t\t\tcontext: 1,\n\t\t\thtml: true,\n\t\t\tseparator: '</diff></failure>\\n' +\n\t\t\t\t'<failure type=\"diff\"><diff class=\"html\">',\n\t\t});\n\t\toutput += '\\n</diff>\\n</failure>\\n';\n\t}\n\toutput += '</testcase>\\n';\n\toutput += '</testsuite>\\n';\n\toutput += '</testsuites>\\n';\n\treturn output;\n};\n\nconst silentFormat = function(error, domain, title, lang, options, results) {\n\treturn '';\n};\n\nclass 
PHPVariantRequest extends ApiRequest {\n\tconstructor(env, title, variant, revid) {\n\t\tsuper(env, title);\n\t\tthis.reqType = \"Variant Parse\";\n\n\t\tconst apiargs = {\n\t\t\tformat: 'json',\n\t\t\taction: 'parse',\n\t\t\tpage: title,\n\t\t\tprop: 'text|revid|displaytitle',\n\t\t\tuselang: 'content',\n\t\t\twrapoutputclass: '',\n\t\t\tdisableeditsection: 'true',\n\t\t\tdisabletoc: 'true',\n\t\t\tdisablelimitreport: 'true'\n\t\t};\n\t\tif (revid) {\n\t\t\t// The parameters `page` and `oldid` can't be used together\n\t\t\tapiargs.page = undefined;\n\t\t\tapiargs.oldid = revid;\n\t\t}\n\t\t// This argument to the API is not documented!  Except in\n\t\t// https://phabricator.wikimedia.org/T44356#439479 and\n\t\t// https://phabricator.wikimedia.org/T34906#381101\n\t\tif (variant) { apiargs.variant = variant; }\n\n\t\tconst uri = env.conf.wiki.apiURI;\n\t\tthis.requestOptions = {\n\t\t\turi,\n\t\t\tmethod: 'POST',\n\t\t\tform: apiargs, // The API arguments\n\t\t\tfollowRedirect: true,\n\t\t\ttimeout: env.conf.parsoid.timeouts.mwApi.extParse,\n\t\t};\n\n\t\tthis.request(this.requestOptions);\n\t}\n\t_handleJSON(error, data) {\n\t\tif (!error && !(data && data.parse)) {\n\t\t\terror = this._errorObj(data, this.text, 'Missing data.parse.');\n\t\t}\n\n\t\tif (error) {\n\t\t\tthis.env.log(\"error\", error);\n\t\t\tthis._processListeners(error, '');\n\t\t} else {\n\t\t\tthis._processListeners(error, data.parse);\n\t\t}\n\t}\n\t_errorObj(data, requestStr, defaultMsg) {\n\t\tif (data && data.error && data.error.code === 'missingtitle') {\n\t\t\treturn new DoesNotExistError(this.title);\n\t\t}\n\t\treturn super._errorObj(data, requestStr, defaultMsg);\n\t}\n}\n\nconst phpFetch = Promise.async(function *(env, title, revid) {\n\tconst parse = yield new Promise((resolve, reject) => {\n\t\tconst req = new PHPVariantRequest(\n\t\t\tenv, title, env.htmlVariantLanguage, revid\n\t\t);\n\t\treq.once('src', (err, src) => {\n\t\t\treturn err ? 
reject(err) : resolve(src);\n\t\t});\n\t});\n\tconst document = DOMUtils.parseHTML(parse.text['*']);\n\tconst displaytitle = parse.displaytitle;\n\trevid = parse.revid;\n\treturn {\n\t\tdocument,\n\t\trevid,\n\t\tdisplaytitle\n\t};\n});\n\nconst parsoidFetch = Promise.async(function *(env, title, options) {\n\tif (!options.useServer) {\n\t\tyield TemplateRequest.setPageSrcInfo(env, title, options.oldid);\n\t\tconst revision = env.page.meta.revision;\n\t\tconst handler = env.getContentHandler(revision.contentmodel);\n\t\tconst document = yield handler.toHTML(env);\n\t\treturn {\n\t\t\tdocument,\n\t\t\trevid: revision.revid,\n\t\t\tdisplaytitle: document.title,\n\t\t};\n\t}\n\tconst domain = options.domain;\n\tlet uri = options.uri;\n\t// Make sure the Parsoid URI ends with `/`\n\tif (!/\\/$/.test(uri)) {\n\t\turi += '/';\n\t}\n\turi += `${domain}/v3/page/html/${encodeURIComponent(title)}`;\n\tif (options.oldid) {\n\t\turi += `/${options.oldid}`;\n\t}\n\tconst resp = yield ScriptUtils.retryingHTTPRequest(10, {\n\t\tmethod: 'GET',\n\t\turi,\n\t\theaders: {\n\t\t\t'User-Agent': env.userAgent,\n\t\t\t'Accept-Language': env.htmlVariantLanguage,\n\t\t}\n\t});\n\t// We may have been redirected to the latest revision. 
Record oldid.\n\tconst res = resp[0];\n\tconst body = resp[1];\n\tif (res.statusCode !== 200) {\n\t\tthrow new Error(`Can\\'t fetch Parsoid source: ${uri}`);\n\t}\n\tconst oldid = res.request.path.replace(/^(.*)\\//, '');\n\tconst document = DOMUtils.parseHTML(body);\n\treturn {\n\t\tdocument,\n\t\trevid: oldid,\n\t\tdisplaytitle: document.title,\n\t};\n});\n\nconst hrefToTitle = function(href) {\n\treturn Util.decodeURIComponent(href.replace(/^(\\.\\.?|\\/wiki)\\//, ''))\n\t\t.replace(/_/g, ' ');\n};\n\nconst nodeHrefToTitle = function(node, suppressCategory) {\n\tconst href = node && node.hasAttribute('href') && node.getAttribute('href');\n\tif (!href) { return null; }\n\tconst title = hrefToTitle(href);\n\tif (suppressCategory) {\n\t\tconst categoryMatch = title.match(/^([^:]+)[:]/);\n\t\tif (categoryMatch) { return null; /* skip it */ }\n\t}\n\treturn title;\n};\n\n/**\n * Pull a list of local titles from wikilinks in a Parsoid HTML document.\n *\n * @param env\n * @param document\n */\nconst spiderDocument = function(env, document) {\n\tconst redirect = document.querySelector('link[rel~=\"mw:PageProp/redirect\"]');\n\tconst nodes = redirect ? [ redirect ] :\n\t\tArray.from(document.querySelectorAll('a[rel~=\"mw:WikiLink\"][href]'));\n\treturn new Set(\n\t\tnodes.map(node => nodeHrefToTitle(node, true)).filter(t => t !== null)\n\t);\n};\n\n/**\n * Pull \"just the text\" from an HTML document, normalizing whitespace\n * differences and suppressing places where Parsoid and PHP output\n * deliberately differs.\n *\n * @param env\n * @param document\n */\nconst extractText = function(env, document) {\n\tvar dt = new DOMTraverser();\n\tvar sep = '';\n\tvar buf = '';\n\t/* We normalize all whitespace in text nodes to a single space. We\n\t * do insert newlines in the output, but only to delimit block\n\t * elements.  Even there, we are careful never to emit two newlines\n\t * in a row, or whitespace before or after a newline. 
*/\n\tconst addSep = (s) => {\n\t\tif (s === '') { return; }\n\t\tif (/\\n/.test(s)) { sep = '\\n'; return; }\n\t\tif (sep === '\\n') { return; }\n\t\tsep = ' ';\n\t};\n\tconst emit = (s) => { if (s !== '') { buf += sep; buf += s; sep = ''; } };\n\tdt.addHandler('#text', (node, env, atTopLevel, tplInfo) => {\n\t\tconst v = node.nodeValue.replace(/\\s+/g, ' ');\n\t\tconst m = /^(\\s*)(.*?)(\\s*)$/.exec(v);\n\t\taddSep(m[1]);\n\t\temit(m[2]);\n\t\taddSep(m[3]);\n\t\treturn true;\n\t});\n\tdt.addHandler('div', (node) => {\n\t\tif (node.classList.contains('magnify') &&\n\t\t\tnode.parentNode &&\n\t\t\tnode.parentNode.classList.contains('thumbcaption')) {\n\t\t\t// Skip the \"magnify\" link, which PHP has and Parsoid doesn't.\n\t\t\treturn node.nextSibling;\n\t\t}\n\t\treturn true;\n\t});\n\t/* These are the block elements which we delimit with newlines (aka,\n\t * we ensure they start on a line of their own). */\n\tvar forceBreak = () => { addSep('\\n'); return true; };\n\tfor (const el of ['p','li','div','table','tr','h1','h2','h3','h4','h5','h6','figure', 'figcaption']) {\n\t\tdt.addHandler(el, forceBreak);\n\t}\n\tdt.addHandler('div', (node) => {\n\t\tif (node.classList.contains('thumbcaption')) {\n\t\t\t// <figcaption> (Parsoid) is marked as forceBreak,\n\t\t\t// so thumbcaption (PHP) should be, too.\n\t\t\tforceBreak();\n\t\t}\n\t\treturn true;\n\t});\n\t/* Separate table columns with spaces */\n\tdt.addHandler('td', () => { addSep(' '); return true; });\n\t/* Suppress reference numbers and linkback text */\n\tdt.addHandler('sup', (node) => {\n\t\tif (\n\t\t\tnode.classList.contains('reference') /* PHP */ ||\n\t\t\tnode.classList.contains('mw-ref') /* Parsoid */\n\t\t) {\n\t\t\treturn node.nextSibling; // Skip contents of this node\n\t\t}\n\t\treturn true;\n\t});\n\tdt.addHandler('span', (node) => {\n\t\tif (\n\t\t\tnode.classList.contains('mw-cite-backlink') ||\n\t\t\t/\\bmw:referencedBy\\b/.test(node.getAttribute('rel') || '')\n\t\t) {\n\t\t\treturn 
node.nextSibling; // Skip contents of this node\n\t\t}\n\t\treturn true;\n\t});\n\tdt.addHandler('figcaption', (node) => {\n\t\t/* Captions are suppressed in PHP for:\n\t\t * figure[typeof~=\"mw:Image/Frameless\"], figure[typeof~=\"mw:Image\"]\n\t\t * See Note 5 of https://www.mediawiki.org/wiki/Specs/HTML/1.7.0#Images\n\t\t */\n\t\tif (DOMDataUtils.hasTypeOf(node.parentNode, 'mw:Image/Frameless') ||\n\t\t\tDOMDataUtils.hasTypeOf(node.parentNode, 'mw:Image')) {\n\t\t\t// Skip caption contents, since they don't appear in PHP output.\n\t\t\treturn node.nextSibling;\n\t\t}\n\t\treturn true;\n\t});\n\t/* Show the targets of wikilinks, since the titles should be\n\t * language-converted too. */\n\tdt.addHandler('a', (node) => {\n\t\tconst rel = node.getAttribute('rel') || '';\n\t\tif (/\\bmw:referencedBy\\b/.test(rel)) {\n\t\t\t// skip reference linkback\n\t\t\treturn node.nextSibling;\n\t\t}\n\t\tlet href = node.getAttribute('href') || '';\n\t\t// Rewrite red links as normal links\n\t\tlet m = /^\\/w\\/index\\.php\\?title=(.*?)&.*redlink=1$/.exec(href);\n\t\tif (m) {\n\t\t\thref = `/wiki/${m[1]}`;\n\t\t}\n\t\t// Local links to this page, or self-links\n\t\tm = /^#/.test(href);\n\t\tif (m || node.classList.contains('mw-selflink')) {\n\t\t\tconst title = encodeURIComponent(env.page.name);\n\t\t\thref = `/wiki/${title}${href}`;\n\t\t}\n\t\t// Now look for wiki links\n\t\tif (node.classList.contains('external')) {\n\t\t\treturn true;\n\t\t}\n\t\tif (/^(\\.\\.?|\\/wiki)\\//.test(href)) {\n\t\t\tconst title = hrefToTitle(href);\n\t\t\taddSep(' ');\n\t\t\temit(`[${title}]`);\n\t\t\taddSep(' ');\n\t\t}\n\t\treturn true;\n\t});\n\tdt.addHandler('link', (node) => {\n\t\tconst rel = node.getAttribute('rel') || '';\n\t\tif (/\\bmw:PageProp\\/redirect\\b/.test(rel)) {\n\t\t\t// Given Parsoid output, emulate PHP output for redirects.\n\t\t\tforceBreak();\n\t\t\temit('Redirect to:');\n\t\t\tforceBreak();\n\t\t\tconst title = 
nodeHrefToTitle(node);\n\t\t\temit(`[${title}]`);\n\t\t\taddSep(' ');\n\t\t\temit(title);\n\t\t\treturn node.nextSibling;\n\t\t}\n\t\treturn true;\n\t});\n\tdt.traverse(document.body);\n\treturn buf;\n};\n\n// Wrap an asynchronous function in code to record/replay network requests\nconst nocksWrap = function(f) {\n\treturn Promise.async(function *(domain, title, lang, options, formatter) {\n\t\tlet nock, dir, nocksFile;\n\t\tif (options.record || options.replay) {\n\t\t\tdir = path.resolve(__dirname, '../nocks/');\n\t\t\tif (!(yield fs.exists(dir))) {\n\t\t\t\tyield fs.mkdir(dir);\n\t\t\t}\n\t\t\tdir = `${dir}/${domain}`;\n\t\t\tif (!(yield fs.exists(dir))) {\n\t\t\t\tyield fs.mkdir(dir);\n\t\t\t}\n\t\t\tnocksFile = `${dir}/lc-${encodeURIComponent(title)}-${lang}.js`;\n\t\t\tif (options.record) {\n\t\t\t\tnock = require('nock');\n\t\t\t\tnock.recorder.rec({ dont_print: true });\n\t\t\t} else {\n\t\t\t\trequire(nocksFile);\n\t\t\t}\n\t\t}\n\t\ttry {\n\t\t\treturn (yield f(domain, title, lang, options, formatter));\n\t\t} finally {\n\t\t\tif (options.record) {\n\t\t\t\tconst nockCalls = nock.recorder.play();\n\t\t\t\tyield fs.writeFile(\n\t\t\t\t\tnocksFile,\n\t\t\t\t\t`'use strict';\\nlet nock = require('nock');\\n${nockCalls.join('\\n')}`,\n\t\t\t\t\t'utf8'\n\t\t\t\t);\n\t\t\t\tnock.recorder.clear();\n\t\t\t\tnock.restore();\n\t\t\t}\n\t\t}\n\t});\n};\n\nconst runTest = nocksWrap(Promise.async(function *(domain, title, lang, options, formatter) {\n\t// Step 0: Configuration & setup\n\tconst parsoidOptions = {\n\t\tloadWMF: true,\n\t};\n\tconst envOptions = {\n\t\tdomain,\n\t\tpageName: title,\n\t\tuserAgent: 'LangConvTest',\n\t\twtVariantLanguage: options.sourceVariant || null,\n\t\thtmlVariantLanguage: lang || null,\n\t\tlogLevels: options.verbose ? 
undefined : [\"fatal\", \"error\", \"warn\"],\n\t};\n\tScriptUtils.setTemplatingAndProcessingFlags(parsoidOptions, options);\n\tScriptUtils.setDebuggingFlags(parsoidOptions, options);\n\tScriptUtils.setColorFlags(options);\n\n\tconst parsoidConfig = new ParsoidConfig(null, parsoidOptions);\n\tconst env = yield MWParserEnvironment.getParserEnv(parsoidConfig, envOptions);\n\n\t// Step 1: Fetch page from PHP API\n\tconst phpDoc = yield phpFetch(env, title, options.oldid);\n\t// Step 2: Fetch page from Parsoid API\n\tconst parsoidDoc = yield parsoidFetch(env, title, {\n\t\tdomain,\n\t\turi: options.parsoidURL,\n\t\toldid: options.oldid || phpDoc.revid,\n\t\tuseServer: options.useServer,\n\t});\n\t// Step 3: Strip most markup (so we're comparing text, not markup)\n\t//  ...but eventually we'll leave <a href> since there's some title\n\t//    conversion that should be done.\n\tconst normalize = out => `TITLE: ${out.displaytitle}\\n\\n` +\n\t\textractText(env, out.document);\n\tconst phpText = normalize(phpDoc);\n\tconst parsoidText = normalize(parsoidDoc);\n\t// Step 4: Compare (and profit!)\n\tconsole.assert(+phpDoc.revid === +parsoidDoc.revid);\n\tconst output = formatter(null, domain, title, lang, options, {\n\t\tphp: phpText,\n\t\tparsoid: parsoidText,\n\t\trevid: phpDoc.revid,\n\t});\n\tconst exitCode = (phpText === parsoidText) ? 0 : 1;\n\n\treturn {\n\t\toutput,\n\t\texitCode,\n\t\t// List of local titles, in case we are spidering test cases\n\t\tlinkedTitles: spiderDocument(env, parsoidDoc.document),\n\t};\n}));\n\nif (require.main === module) {\n\tconst standardOpts = ScriptUtils.addStandardOptions({\n\t\tsourceVariant: {\n\t\t\tdescription: 'Force conversion to assume the given variant for' +\n\t\t\t\t' the source wikitext',\n\t\t\tboolean: false,\n\t\t\tdefault: null,\n\t\t},\n\t\tdomain: {\n\t\t\tdescription: 'Which wiki to use; e.g. 
\"sr.wikipedia.org\" for' +\n\t\t\t\t' Serbian wikipedia',\n\t\t\tboolean: false,\n\t\t\tdefault: 'sr.wikipedia.org',\n\t\t},\n\t\toldid: {\n\t\t\tdescription: 'Optional oldid of the given page. If not given,' +\n\t\t\t\t' will use the latest revision.',\n\t\t\tboolean: false,\n\t\t\tdefault: null,\n\t\t},\n\t\tparsoidURL: {\n\t\t\tdescription: 'The URL for the Parsoid API',\n\t\t\tboolean: false,\n\t\t\tdefault: '',\n\t\t},\n\t\tapiURL: {\n\t\t\tdescription: 'http path to remote API,' +\n\t\t\t\t' e.g. http://sr.wikipedia.org/w/api.php',\n\t\t\tboolean: false,\n\t\t\tdefault: '',\n\t\t},\n\t\txml: {\n\t\t\tdescription: 'Use xml output format',\n\t\t\tboolean: true,\n\t\t\tdefault: false,\n\t\t},\n\t\tcheck: {\n\t\t\tdescription: 'Exit with non-zero exit code if differences found using selser',\n\t\t\tboolean: true,\n\t\t\tdefault: false,\n\t\t\talias: 'c',\n\t\t},\n\t\t'record': {\n\t\t\tdescription: 'Record http requests for later replay',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t\t'replay': {\n\t\t\tdescription: 'Replay recorded http requests for later replay',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t\t'spider': {\n\t\t\tdescription: 'Spider <number> additional pages past the given one',\n\t\t\t'boolean': false,\n\t\t\t'default': 0,\n\t\t},\n\t\t'silent': {\n\t\t\tdescription: 'Skip output (used with --record --spider to load caches)',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t\t'verbose': {\n\t\t\tdescription: 'Log at level \"info\" as well',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t\t'useServer': {\n\t\t\tdescription: 'Use a parsoid server',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t});\n\n\tPromise.async(function *() {\n\t\tconst opts = yargs\n\t\t.usage(\n\t\t\t'Usage: $0 [options] <page-title> <variantLanguage>\\n' +\n\t\t\t'The page title should be the \"true title\",' +\n\t\t\t'i.e., without any url encoding which might be necessary if it appeared in wikitext.' 
+\n\t\t\t'\\n\\n'\n\t\t)\n\t\t.options(standardOpts)\n\t\t.strict();\n\t\tconst argv = opts.argv;\n\t\tif (!argv._.length) {\n\t\t\treturn opts.showHelp();\n\t\t}\n\t\tconst title = String(argv._[0]);\n\t\tconst lang = String(argv._[1]);\n\t\tlet ret = null;\n\t\tif (argv.record || argv.replay) {\n\t\t\t// Don't fork a separate server if record/replay\n\t\t\targv.useServer = false;\n\t\t}\n\t\tif (argv.useServer && !argv.parsoidURL) {\n\t\t\t// Start our own Parsoid server\n\t\t\tconst serviceWrapper = require('../tests/serviceWrapper.js');\n\t\t\tconst serverOpts = {\n\t\t\t\tlogging: { level: 'info' },\n\t\t\t};\n\t\t\tif (argv.apiURL) {\n\t\t\t\tserverOpts.mockURL = argv.apiURL;\n\t\t\t\targv.domain = 'customwiki';\n\t\t\t} else {\n\t\t\t\tserverOpts.skipMock = true;\n\t\t\t}\n\t\t\tret = yield serviceWrapper.runServices(serverOpts);\n\t\t\targv.parsoidURL = ret.parsoidURL;\n\t\t}\n\t\tconst formatter =\n\t\t\tScriptUtils.booleanOption(argv.silent) ? silentFormat :\n\t\t\tScriptUtils.booleanOption(argv.xml) ? 
xmlFormat :\n\t\t\tplainFormat;\n\t\tconst domain = argv.domain || 'sr.wikipedia.org';\n\t\tconst queue = [title];\n\t\tconst titlesDone = new Set();\n\t\tlet exitCode = 0;\n\t\tlet r;\n\t\tfor (let i = 0; i < queue.length; i++) {\n\t\t\tif (titlesDone.has(queue[i])) {\n\t\t\t\tcontinue; // duplicate title\n\t\t\t}\n\t\t\tif (argv.spider > 1 && argv.verbose) {\n\t\t\t\tconsole.log('%s (%d/%d)', queue[i], titlesDone.size, argv.spider);\n\t\t\t}\n\t\t\ttry {\n\t\t\t\tr = yield runTest(domain, queue[i], lang, argv, formatter);\n\t\t\t} catch (e) {\n\t\t\t\tif (e instanceof DoesNotExistError && argv.spider > 1) {\n\t\t\t\t\t// Ignore page-not-found if we are spidering.\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\t\t\t\tr = {\n\t\t\t\t\terror: true,\n\t\t\t\t\toutput: formatter(e, domain, queue[i], lang, argv),\n\t\t\t\t\texitCode: 2,\n\t\t\t\t};\n\t\t\t}\n\t\t\texitCode = Math.max(exitCode, r.exitCode);\n\t\t\tif (r.output) {\n\t\t\t\tconsole.log(r.output);\n\t\t\t}\n\t\t\t// optionally, spider\n\t\t\tif (argv.spider > 1) {\n\t\t\t\tif (!r.error) {\n\t\t\t\t\ttitlesDone.add(queue[i]);\n\t\t\t\t\tfor (const t of r.linkedTitles) {\n\t\t\t\t\t\tif (/:/.test(t)) { continue; /* hack: no namespaces */ }\n\t\t\t\t\t\tqueue.push(t);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\tif (titlesDone.size >= argv.spider) {\n\t\t\t\t\tbreak; /* done! 
*/\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t\tif (ret !== null) {\n\t\t\tyield ret.runner.stop();\n\t\t}\n\t\tif (argv.check || exitCode > 1) {\n\t\t\tprocess.exit(exitCode);\n\t\t}\n\t})().done();\n} else if (typeof module === 'object') {\n\tmodule.exports.runTest = runTest;\n\n\tmodule.exports.jsonFormat = jsonFormat;\n\tmodule.exports.plainFormat = plainFormat;\n\tmodule.exports.xmlFormat = xmlFormat;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/bin/normalize.test.js","messages":[{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":66,"column":2,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":66,"endColumn":17}],"errorCount":1,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\nvar DOMNormalizer = require('../lib/html2wt/DOMNormalizer.js').DOMNormalizer;\nvar ContentUtils = require('../lib/utils/ContentUtils.js').ContentUtils;\nvar Promise = require('../lib/utils/promise.js');\nvar ScriptUtils = require('../tools/ScriptUtils.js').ScriptUtils;\nvar MockEnv = require('../tests/MockEnv.js').MockEnv;\n\nvar yargs = require('yargs');\nvar fs = require('pn/fs');\n\nvar opts = yargs\n.usage(\"Usage: $0 [options] [html-file]\\n\\nProvide either inline html OR 1 file\")\n.options({\n\thelp: {\n\t\tdescription: 'Show this message',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\tenableSelserMode: {\n\t\tdescription: [\n\t\t\t'Run in selser mode (but dom-diff markers are not loaded).',\n\t\t\t'This just forces more normalization code to run.',\n\t\t\t'So, this is \"fake selser\" mode till we are able to load diff markers from attributes'\n\t\t].join(' '),\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\thtml: {\n\t\tdescription: 'html',\n\t\t'boolean': 
false,\n\t\t'default': '',\n\t},\n});\n\nPromise.async(function *() {\n\tvar argv = opts.argv;\n\tvar html = argv.html;\n\tif (!html && argv._[0]) {\n\t\thtml = yield fs.readFile(argv._[0], 'utf8');\n\t}\n\n\tif (ScriptUtils.booleanOption(argv.help) || !html) {\n\t\topts.showHelp();\n\t\treturn;\n\t}\n\n\tconst env = new MockEnv({\n\t\tscrubWikitext: true,\n\t}, null);\n\n\tvar mockState = {\n\t\tenv,\n\t\tselserMode: argv.enableSelserMode\n\t};\n\n\tconst domBody = ContentUtils.ppToDOM(env, html, { markNew: true });\n\tconst normalizedBody = (new DOMNormalizer(mockState).normalize(domBody));\n\n\tContentUtils.dumpDOM(normalizedBody, 'Normalized DOM', { env: mockState.env, storeDiffMark: true });\n\n\tprocess.exit(0);\n})().done();\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/bin/roundtrip-test.js","messages":[{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":926,"column":4,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":926,"endColumn":28}],"errorCount":1,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\nrequire('colors');\nconst { htmlDiff } = require('./diff.html.js');\n\nvar entities = require('entities');\nvar fs = require('fs');\nvar yargs = require('yargs');\nvar zlib = require('pn/zlib');\n\nvar Promise = require('../lib/utils/promise.js');\nvar Util = require('../lib/utils/Util.js').Util;\nvar ScriptUtils = require('../tools/ScriptUtils.js').ScriptUtils;\nvar ContentUtils = require('../lib/utils/ContentUtils.js').ContentUtils;\nvar DOMUtils = require('../lib/utils/DOMUtils.js').DOMUtils;\nvar DOMDataUtils = require('../lib/utils/DOMDataUtils.js').DOMDataUtils;\nvar TestUtils = require('../tests/TestUtils.js').TestUtils;\nvar WTUtils = 
require('../lib/utils/WTUtils.js').WTUtils;\nvar ParsoidConfig = require('../lib/config/ParsoidConfig.js').ParsoidConfig;\nvar Diff = require('../lib/utils/Diff.js').Diff;\nvar JSUtils = require('../lib/utils/jsutils.js').JSUtils;\nvar MockEnv = require('../tests/MockEnv.js').MockEnv;\n\nvar defaultContentVersion = '2.2.0';\n\nfunction displayDiff(type, count) {\n\tvar pad = (10 - type.length);  // Be positive!\n\ttype = type[0].toUpperCase() + type.substr(1);\n\treturn type + ' differences' + ' '.repeat(pad) + ': ' + count + '\\n';\n}\n\nvar jsonFormat = function(error, prefix, title, results, profile) {\n\tvar diffs = {\n\t\thtml2wt: { semantic: 0, syntactic: 0 },\n\t\tselser: { semantic: 0, syntactic: 0 },\n\t};\n\tif (!error) {\n\t\tresults.forEach(function(result) {\n\t\t\tvar mode = diffs[result.selser ? 'selser' : 'html2wt'];\n\t\t\tmode[result.type === 'fail' ? 'semantic' : 'syntactic']++;\n\t\t});\n\t}\n\treturn {\n\t\terror: error,\n\t\tresults: diffs,\n\t};\n};\n\nvar plainFormat = function(err, prefix, title, results, profile) {\n\tvar testDivider = '='.repeat(70) + '\\n';\n\tvar diffDivider = '-'.repeat(70) + '\\n';\n\tvar output = '';\n\n\tif (err) {\n\t\toutput += 'Parser failure!\\n\\n';\n\t\toutput += diffDivider;\n\t\toutput += err;\n\t\tif (err.stack) {\n\t\t\toutput += '\\nStack trace: ' + err.stack;\n\t\t}\n\t} else {\n\t\tvar diffs = {\n\t\t\thtml2wt: { semantic: 0, syntactic: 0 },\n\t\t\tselser: { semantic: 0, syntactic: 0 },\n\t\t};\n\t\tfor (var i = 0; i < results.length; i++) {\n\t\t\tvar result = results[i];\n\t\t\toutput += testDivider;\n\t\t\tif (result.type === 'fail') {\n\t\t\t\toutput += 'Semantic difference' +\n\t\t\t\t\t(result.selser ? ' (selser)' : '') + ':\\n\\n';\n\t\t\t\toutput += result.wtDiff + '\\n';\n\t\t\t\toutput += diffDivider + 'HTML diff:\\n\\n' +\n\t\t\t\t\tresult.htmlDiff + '\\n';\n\t\t\t\tdiffs[result.selser ? 
'selser' : 'html2wt'].semantic++;\n\t\t\t} else {\n\t\t\t\toutput += 'Syntactic difference' +\n\t\t\t\t\t(result.selser ? ' (selser)' : '') + ':\\n\\n';\n\t\t\t\toutput += result.wtDiff + '\\n';\n\t\t\t\tdiffs[result.selser ? 'selser' : 'html2wt'].syntactic++;\n\t\t\t}\n\t\t}\n\t\toutput += testDivider;\n\t\toutput += testDivider;\n\t\toutput += 'SUMMARY:\\n';\n\t\toutput += diffDivider;\n\t\tvar total = 0;\n\t\tObject.keys(diffs).forEach(function(diff) {\n\t\t\toutput += diff + '\\n';\n\t\t\toutput += diffDivider;\n\t\t\tObject.keys(diffs[diff]).forEach(function(type) {\n\t\t\t\tvar count = diffs[diff][type];\n\t\t\t\ttotal += count;\n\t\t\t\toutput += displayDiff(type, count);\n\t\t\t});\n\t\t\toutput += diffDivider;\n\t\t});\n\t\toutput += displayDiff('all', total);\n\t\toutput += testDivider;\n\t\toutput += testDivider;\n\t}\n\n\treturn output;\n};\n\nvar xmlFormat = function(err, prefix, title, results, profile) {\n\tvar i, result;\n\tvar article = Util.escapeHtml(prefix + ':' + title);\n\tvar output = '<testsuites>\\n';\n\tvar outputTestSuite = function(selser) {\n\t\toutput += '<testsuite name=\"Roundtrip article ' + article;\n\t\tif (selser) {\n\t\t\toutput += ' (selser)';\n\t\t}\n\t\toutput += '\">\\n';\n\t};\n\n\tif (err) {\n\t\toutputTestSuite(false);\n\t\toutput += '<testcase name=\"entire article\">';\n\t\toutput += '<error type=\"parserFailedToFinish\">';\n\t\toutput += Util.escapeHtml(err.stack || err.toString());\n\t\toutput += '</error></testcase>';\n\t} else if (!results.length) {\n\t\toutputTestSuite(false);\n\t} else {\n\t\tvar currentSelser = results[0].selser;\n\t\toutputTestSuite(currentSelser);\n\t\tfor (i = 0; i < results.length; i++) {\n\t\t\tresult = results[i];\n\n\t\t\t// When going from normal to selser results, switch to a new\n\t\t\t// test suite.\n\t\t\tif (currentSelser !== result.selser) {\n\t\t\t\toutput += '</testsuite>\\n';\n\t\t\t\tcurrentSelser = result.selser;\n\t\t\t\toutputTestSuite(currentSelser);\n\t\t\t}\n\n\t\t\toutput 
+= '<testcase name=\"' + article;\n\t\t\toutput += ' character ' + result.offset[0].start + '\">\\n';\n\n\t\t\tif (result.type === 'fail') {\n\t\t\t\toutput += '<failure type=\"significantHtmlDiff\">\\n';\n\n\t\t\t\toutput += '<diff class=\"wt\">\\n';\n\t\t\t\toutput += Util.escapeHtml(result.wtDiff);\n\t\t\t\toutput += '\\n</diff>\\n';\n\n\t\t\t\toutput += '<diff class=\"html\">\\n';\n\t\t\t\toutput += Util.escapeHtml(result.htmlDiff);\n\t\t\t\toutput += '\\n</diff>\\n';\n\n\t\t\t\toutput += '</failure>\\n';\n\t\t\t} else {\n\t\t\t\toutput += '<skipped type=\"insignificantWikitextDiff\">\\n';\n\t\t\t\toutput += Util.escapeHtml(result.wtDiff);\n\t\t\t\toutput += '\\n</skipped>\\n';\n\t\t\t}\n\n\t\t\toutput += '</testcase>\\n';\n\t\t}\n\t}\n\toutput += '</testsuite>\\n';\n\n\t// Output the profiling data\n\tif (profile) {\n\t\t// Delete the start time to avoid serializing it\n\t\tif (profile.time && profile.time.start) {\n\t\t\tdelete profile.time.start;\n\t\t}\n\t\toutput += '<perfstats>\\n';\n\t\tObject.keys(profile).forEach(function(type) {\n\t\t\tObject.keys(profile[type]).forEach(function(prop) {\n\t\t\t\toutput += '<perfstat type=\"' + TestUtils.encodeXml(type) + ':';\n\t\t\t\toutput += TestUtils.encodeXml(prop);\n\t\t\t\toutput += '\">';\n\t\t\t\toutput += TestUtils.encodeXml(profile[type][prop].toString());\n\t\t\t\toutput += '</perfstat>\\n';\n\t\t\t});\n\t\t});\n\t\toutput += '</perfstats>\\n';\n\t}\n\toutput += '</testsuites>';\n\n\treturn output;\n};\n\n// Find the subset of leaf/non-leaf nodes whose DSR ranges\n// span the wikitext range provided as input.\nvar findMatchingNodes = function(node, range) {\n\tconsole.assert(DOMUtils.isElt(node));\n\n\t// Skip subtrees that are outside our target range\n\tvar dp = DOMDataUtils.getDataParsoid(node);\n\tif (!Util.isValidDSR(dp.dsr) || dp.dsr[0] > range.end || dp.dsr[1] < range.start) {\n\t\treturn [];\n\t}\n\n\t// If target range subsumes the node, we are done.\n\tif (dp.dsr[0] >= range.start && dp.dsr[1] <= 
range.end) {\n\t\treturn [node];\n\t}\n\n\t// Cannot inspect template content subtree at a finer grained level\n\tif (WTUtils.isFirstEncapsulationWrapperNode(node)) {\n\t\treturn [node];\n\t}\n\n\t// Cannot inspect image subtree at a finer grained level\n\tvar typeOf = node.getAttribute('typeof') || '';\n\tif (/\\bmw:Image(\\/|\\s|$)/.test(typeOf) && /^(FIGURE|SPAN)$/.test(node.nodeName)) {\n\t\treturn [node];\n\t}\n\n\t// We are in the target range -- examine children.\n\t// 1. Walk past nodes that are before our desired range.\n\t// 2. Collect nodes within our desired range.\n\t// 3. Stop walking once you move beyond the desired range.\n\tvar elts = [];\n\tvar offset = dp.dsr[0];\n\tvar c = node.firstChild;\n\twhile (c) {\n\t\tif (DOMUtils.isElt(c)) {\n\t\t\tdp = DOMDataUtils.getDataParsoid(c);\n\t\t\tvar dsr = dp.dsr;\n\t\t\tif (Util.isValidDSR(dsr)) {\n\t\t\t\tif (dsr[1] >= range.start) {\n\t\t\t\t\t// We have an overlap!\n\t\t\t\t\telts = elts.concat(findMatchingNodes(c, range));\n\t\t\t\t}\n\t\t\t\toffset = dp.dsr[1];\n\t\t\t} else {\n\t\t\t\t// SSS FIXME: This is defensive coding here.\n\t\t\t\t//\n\t\t\t\t// This should not happen really anymore.\n\t\t\t\t// DSR computation is fairly solid now and\n\t\t\t\t// shouldn't be leaving holes.\n\t\t\t\t//\n\t\t\t\t// If we see no errors in rt-testing runs,\n\t\t\t\t// I am going to rip this out.\n\n\t\t\t\tconsole.log(\"error/diff\", \"Bad dsr for \" + c.nodeName + \": \"\n\t\t\t\t\t+ c.outerHTML.substr(0, 50));\n\n\t\t\t\tif (dp.dsr && typeof (dsr[1]) === 'number') {\n\t\t\t\t\t// We can cope in this case\n\t\t\t\t\tif (dsr[1] >= range.start) {\n\t\t\t\t\t\t// Update dsr[0]\n\t\t\t\t\t\tdp.dsr[0] = offset;\n\n\t\t\t\t\t\t// We have an overlap!\n\t\t\t\t\t\telts = elts.concat(findMatchingNodes(c, range));\n\t\t\t\t\t}\n\t\t\t\t\toffset = dp.dsr[1];\n\t\t\t\t} else if (offset >= range.start) {\n\t\t\t\t\t// Swallow it wholesale rather than try\n\t\t\t\t\t// to find finer-grained matches in the 
subtree\n\t\t\t\t\telts.push(c);\n\n\t\t\t\t\t// offset will now be out-of-sync till we hit\n\t\t\t\t\t// another element with a valid DSR[1] value.\n\t\t\t\t}\n\t\t\t}\n\t\t} else {\n\t\t\tvar len = DOMUtils.isText(c) ? c.nodeValue.length : WTUtils.decodedCommentLength(c);\n\t\t\tif (offset + len >= range.start) {\n\t\t\t\t// We have an overlap!\n\t\t\t\telts.push(c);\n\t\t\t}\n\t\t\toffset += len;\n\t\t}\n\n\t\t// All done!\n\t\tif (offset > range.end) {\n\t\t\tbreak;\n\t\t}\n\n\t\t// Skip over encapsulated content\n\t\tif (WTUtils.isFirstEncapsulationWrapperNode(c)) {\n\t\t\tc = WTUtils.skipOverEncapsulatedContent(c);\n\t\t} else {\n\t\t\tc = c.nextSibling;\n\t\t}\n\t}\n\n\treturn elts;\n};\n\nvar getMatchingHTML = function(body, offsetRange, nlDiffs) {\n\t// If the diff context straddles a template boundary (*) and if\n\t// the HTML context includes the template content in only one\n\t// the new/old DOMs, we can falsely flag this as a semantic\n\t// diff. To improve the possibility of including the template\n\t// content in both DOMs, expand range at both ends by 1 char.\n\t//\n\t// (*) This happens because our P-wrapping code occasionally\n\t//     swallows newlines into template context.\n\t// See https://phabricator.wikimedia.org/T89628\n\tif (nlDiffs) {\n\t\toffsetRange.start -= 1;\n\t\toffsetRange.end += 1;\n\t}\n\n\tvar html = '';\n\tvar out = findMatchingNodes(body, offsetRange);\n\tfor (var i = 0; i < out.length; i++) {\n\t\t// node need not be an element always!\n\t\tconst node = out[i];\n\t\tDOMDataUtils.visitAndStoreDataAttribs(node);\n\t\thtml += ContentUtils.toXML(node, { smartQuote: false });\n\t\tDOMDataUtils.visitAndLoadDataAttribs(node);\n\t}\n\thtml = TestUtils.normalizeOut(html);\n\n\t// Normalize away <br/>'s added by Parsoid because of newlines in wikitext.\n\t// Do this always, not just when nlDiffs is true, because newline diffs\n\t// can show up at extremities of other wt diffs.\n\treturn html.replace(/<p>\\s*<br\\s*\\/?>\\s*/g, 
'<p>').replace(/<p><\\/p>/g, '').replace(/(^\\s+|\\s+$)/g, '');\n};\n\n/* This doesn't try to do a really thorough job of normalization and misses a number\n * of scenarios, for example, anywhere where sol-transparent markup like comments,\n * noinclude, category links, etc. are present.\n *\n * On the flip side, it can occasionally do incorrect normalization when this markup\n * is present in extension blocks (nowiki, syntaxhighlight, etc.) where this text\n * is not really interpreted as wikitext.\n */\nfunction normalizeWikitext(wt, opts) {\n\tif (opts.preDiff) {\n\t\t// Whitespace in ordered, unordered, definition lists\n\t\t// Whitespace in first table cell/header, row, and caption\n\t\twt = wt.replace(/^([*#:;]|\\|[-+|]?|!!?)[ \\t]*(.*?)[ \\t]*$/mg, \"$1$1\");\n\n\t\t// Whitespace in headings\n\t\twt = wt.replace(/^(=+)[ \\t]*([^\\n]*?)[ \\t]*(=+)[ \\t]*$/mg, \"$1$2$3\");\n\t}\n\n\tif (opts.newlines) {\n\t\t// Normalize newlines before/after headings\n\t\twt = wt.replace(/\\n*(\\n=[^\\n]*=$\\n)\\n*/mg, \"$1\");\n\n\t\t// Normalize newlines before lists\n\t\twt = wt.replace(/(^[^*][^\\n]*$\\n)\\n+([*])/mg, \"$1$2\");\n\t\twt = wt.replace(/(^[^#][^\\n]*$\\n)\\n+([#])/mg, \"$1$2\");\n\t\twt = wt.replace(/(^[^:][^\\n]*$\\n)\\n+([:])/mg, \"$1$2\");\n\t\twt = wt.replace(/(^[^;][^\\n]*$\\n)\\n+([;])/mg, \"$1$2\");\n\n\t\t// Normalize newlines after lists\n\t\twt = wt.replace(/(^[*][^\\n]*$\\n)\\n+([^*])/mg, \"$1$2\");\n\t\twt = wt.replace(/(^[#][^\\n]*$\\n)\\n+([^#])/mg, \"$1$2\");\n\t\twt = wt.replace(/(^[:][^\\n]*$\\n)\\n+([^:])/mg, \"$1$2\");\n\t\twt = wt.replace(/(^[;][^\\n]*$\\n)\\n+([^;])/mg, \"$1$2\");\n\n\t\t// Normalize newlines before/after tables\n\t\twt = wt.replace(/\\n+(\\n{\\|)/mg, \"$1\");\n\t\twt = wt.replace(/(\\|}\\n)\\n+/mg, \"$1\");\n\n\t\t// Strip leading & trailing newlines\n\t\twt = wt.replace(/^\\n+|\\n$/, '');\n\t}\n\n\tif (opts.postDiff) {\n\t\t// Ignore leading tabs vs. 
leading spaces\n\t\twt = wt.replace(/^\\t/, ' ');\n\t\twt = wt.replace(/\\n\\t/g, '\\n ');\n\t\t// Normalize multiple spaces to single space\n\t\twt = wt.replace(/ +/g, ' ');\n\t\t// Ignore capitalization of tags and void tag indications\n\t\twt = wt.replace(/<(\\/?)([^ >\\/]+)((?:[^>\\/]|\\/(?!>))*)\\/?>/g,\n\t\t\tfunction(match, close, name, remaining) {\n\t\t\t\treturn '<' + close + name.toLowerCase() +\n\t\t\t\t\tremaining.replace(/ $/, '') + '>';\n\t\t\t});\n\t\t// Ignore whitespace in table cell attributes\n\t\twt = wt.replace(/(^|\\n|\\|(?=\\|)|!(?=!))(\\{\\||\\|[\\-+]*|!) *([^|\\n]*?) *(?=[|\\n]|$)/g, '$1$2$3');\n\t\t// Ignore trailing semicolons and spaces in style attributes\n\t\twt = wt.replace(/style\\s*=\\s*\"[^\"]+\"/g, function(match) {\n\t\t\treturn match.replace(/\\s|;(?=\")/g, '');\n\t\t});\n\t\t// Strip double-quotes\n\t\twt = wt.replace(/\"([^\"]*?)\"/g, '$1');\n\t\t// Ignore implicit </small> and </center> in table cells or the end\n\t\t// of the wting for now\n\t\twt = wt.replace(/(^|\\n)<\\/(?:small|center)>(?=\\n[|!]|\\n?$)/g, '');\n\t\twt = wt.replace(/([|!].*?)<\\/(?:small|center)>(?=\\n[|!]|\\n?$)/gi, '$1');\n\t}\n\n\treturn wt;\n}\n\n// Get diff substrings from offsets\nvar formatDiff = function(oldWt, newWt, offset, context) {\n\treturn [\n\t\t'------',\n\t\toldWt.substring(offset[0].start - context, offset[0].start).blue +\n\t\toldWt.substring(offset[0].start, offset[0].end).green +\n\t\toldWt.substring(offset[0].end, offset[0].end + context).blue,\n\t\t'++++++',\n\t\tnewWt.substring(offset[1].start - context, offset[1].start).blue +\n\t\tnewWt.substring(offset[1].start, offset[1].end).red +\n\t\tnewWt.substring(offset[1].end, offset[1].end + context).blue,\n\t].join('\\n');\n};\n\nfunction stripElementIds(node) {\n\twhile (node) {\n\t\tif (DOMUtils.isElt(node)) {\n\t\t\tvar id = node.getAttribute('id') || '';\n\t\t\tif (/^mw[\\w-]{2,}$/.test(id)) {\n\t\t\t\tnode.removeAttribute('id');\n\t\t\t}\n\t\t\tif (node.firstChild) 
{\n\t\t\t\tstripElementIds(node.firstChild);\n\t\t\t}\n\t\t}\n\t\tnode = node.nextSibling;\n\t}\n}\n\nfunction genSyntacticDiffs(data) {\n\t// Do another diff without normalizations\n\n\tvar results = [];\n\tvar diff = Diff.diffLines(data.oldWt, data.newWt);\n\tvar offsets = Diff.convertDiffToOffsetPairs(diff, data.oldLineLengths, data.newLineLengths);\n\tfor (var i = 0; i < offsets.length; i++) {\n\t\tvar offset = offsets[i];\n\t\tresults.push({\n\t\t\ttype: 'skip',\n\t\t\toffset: offset,\n\t\t\twtDiff: formatDiff(data.oldWt, data.newWt, offset, 0),\n\t\t});\n\t}\n\treturn results;\n}\n\nvar checkIfSignificant = function(offsets, data) {\n\tvar oldWt = data.oldWt;\n\tvar newWt = data.newWt;\n\n\tconst dummyEnv = new MockEnv({}, null);\n\n\tvar oldBody = dummyEnv.createDocument(data.oldHTML.body).body;\n\tvar newBody = dummyEnv.createDocument(data.newHTML.body).body;\n\n\t// Merge pagebundles so that HTML nodes can be compared and diff'ed.\n\tDOMDataUtils.applyPageBundle(oldBody.ownerDocument, {\n\t\tparsoid: data.oldDp.body,\n\t\tmw: data.oldMw && data.oldMw.body,\n\t});\n\tDOMDataUtils.applyPageBundle(newBody.ownerDocument, {\n\t\tparsoid: data.newDp.body,\n\t\tmw: data.newMw && data.newMw.body,\n\t});\n\n\t// Strip 'mw..' ids from the DOMs. 
This matters for 2 scenarios:\n\t// * reduces noise in visual diffs\n\t// * all other things being equal after normalization, we don't\n\t//   assume DOMs are different simply because ids are different\n\tstripElementIds(oldBody.ownerDocument.body);\n\tstripElementIds(newBody.ownerDocument.body);\n\n\t// Strip section tags from the DOMs\n\tContentUtils.stripSectionTagsAndFallbackIds(oldBody.ownerDocument.body);\n\tContentUtils.stripSectionTagsAndFallbackIds(newBody.ownerDocument.body);\n\n\tvar i, offset;\n\tvar results = [];\n\t// Use the full tests for fostered content.\n\t// Fostered/misnested content => semantic diffs.\n\tif (!/(\"|&quot;)(fostered|misnested)(\"|&quot;)\\s*:\\s*true\\b/.test(oldBody.outerHTML)) {\n\t\t// Quick test for no semantic diffs\n\t\t// If parsoid-normalized HTML for old and new wikitext is identical,\n\t\t// the wt-diffs are purely syntactic.\n\t\t//\n\t\t// FIXME: abstract to ensure same opts are used for parsoidPost and normalizeOut\n\t\tconst normOpts = { parsoidOnly: true, scrubWikitext: true };\n\t\tconst normalizedOld = TestUtils.normalizeOut(oldBody, normOpts);\n\t\tconst normalizedNew = TestUtils.normalizeOut(newBody, normOpts);\n\t\tif (normalizedOld === normalizedNew) {\n\t\t\treturn genSyntacticDiffs(data);\n\t\t} else {\n\t\t\t// Uncomment to log the cause of the failure.  This is often useful\n\t\t\t// for determining the root of non-determinism in rt.  See T151474\n\t\t\t// console.log(Diff.diffLines(normalizedOld, normalizedNew));\n\t\t}\n\t}\n\n\t// FIXME: In this code path below, the returned diffs might\n\t// underreport syntactic diffs since these are based on\n\t// diffs on normalized wikitext. 
Unclear how to tackle this.\n\n\t// Do this after the quick test above because in `parsoidOnly`\n\t// normalization, data-mw is not stripped.\n\tDOMDataUtils.visitAndLoadDataAttribs(oldBody);\n\tDOMDataUtils.visitAndLoadDataAttribs(newBody);\n\n\t// Now, proceed with full blown diffs\n\tfor (i = 0; i < offsets.length; i++) {\n\t\toffset = offsets[i];\n\t\tvar thisResult = { offset: offset };\n\n\t\t// Default: syntactic diff + no diff context\n\t\tthisResult.type = 'skip';\n\t\tthisResult.wtDiff = formatDiff(oldWt, newWt, offset, 0);\n\n\t\t// Is this a newline separator diff?\n\t\tvar oldStr = oldWt.substring(offset[0].start, offset[0].end);\n\t\tvar newStr = newWt.substring(offset[1].start, offset[1].end);\n\t\tvar nlDiffs = /^\\s*$/.test(oldStr) && /^\\s*$/.test(newStr)\n\t\t\t&& (/\\n/.test(oldStr) || /\\n/.test(newStr));\n\n\t\t// Check if this is really a semantic diff\n\t\tvar oldHTML = getMatchingHTML(oldBody, offset[0], nlDiffs);\n\t\tvar newHTML = getMatchingHTML(newBody, offset[1], nlDiffs);\n\t\tvar diff = Diff.patchDiff(oldHTML, newHTML);\n\t\tif (diff !== null) {\n\t\t\t// Normalize wts to check if we really have a semantic diff\n\t\t\tvar wt1 = normalizeWikitext(oldWt.substring(offset[0].start, offset[0].end), { newlines: true, postDiff: true });\n\t\t\tvar wt2 = normalizeWikitext(newWt.substring(offset[1].start, offset[1].end), { newlines: true, postDiff: true });\n\t\t\tif (wt1 !== wt2) {\n\n\t\t\t\t// Syntatic diff + provide context for semantic diffs\n\t\t\t\tthisResult.type = 'fail';\n\t\t\t\tthisResult.wtDiff = formatDiff(oldWt, newWt, offset, 25);\n\n\t\t\t\t// Don't clog the rt-test server db with humongous diffs\n\t\t\t\tif (diff.length > 2000) {\n\t\t\t\t\tdiff = diff.substring(0, 2000) + \"-- TRUNCATED TO 2000 chars --\";\n\t\t\t\t}\n\t\t\t\tthisResult.htmlDiff = diff;\n\t\t\t}\n\t\t}\n\t\tresults.push(thisResult);\n\t}\n\n\treturn results;\n};\n\nvar UA = 'Roundtrip-Test';\n\nvar parsoidPost = Promise.async(function *(profile, options) 
{\n\tvar httpOptions = {\n\t\tmethod: 'POST',\n\t\tbody: options.data,\n\t\theaders: {\n\t\t\t'User-Agent': UA,\n\t\t},\n\t};\n\t// For compatibility with Parsoid/PHP service\n\thttpOptions.body.offsetType = 'ucs2';\n\n\tvar uri = options.uri + 'transform/';\n\tif (options.html2wt) {\n\t\turi += 'pagebundle/to/wikitext/' + options.title;\n\t\tif (options.oldid) {\n\t\t\turi += '/' + options.oldid;\n\t\t}\n\t\thttpOptions.body.scrub_wikitext = true;\n\t\t// We want to encode the request but *not* decode the response.\n\t\thttpOptions.body = JSON.stringify(httpOptions.body);\n\t\thttpOptions.headers['Content-Type'] = 'application/json';\n\t} else {  // wt2html\n\t\turi += 'wikitext/to/pagebundle/' + options.title;\n\t\tif (options.oldid) {\n\t\t\turi += '/' + options.oldid;\n\t\t}\n\t\thttpOptions.headers.Accept = 'application/json; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/' + options.outputContentVersion + '\"';\n\t\t// setting json here encodes the request *and* decodes the response.\n\t\thttpOptions.json = true;\n\t}\n\thttpOptions.uri = uri;\n\thttpOptions.proxy = options.proxy;\n\n\tvar result = yield ScriptUtils.retryingHTTPRequest(10, httpOptions);\n\tvar body = result[1];\n\n\t// FIXME: Parse time was removed from profiling when we stopped\n\t// sending the x-parsoid-performance header.\n\tif (options.recordSizes) {\n\t\tvar pre = '';\n\t\tif (options.profilePrefix) {\n\t\t\tpre += options.profilePrefix + ':';\n\t\t}\n\t\tvar str;\n\t\tif (options.html2wt) {\n\t\t\tpre += 'html:';\n\t\t\tstr = body;\n\t\t} else {\n\t\t\tpre += 'wt:';\n\t\t\tstr = body.html.body;\n\t\t}\n\t\tprofile.size[pre + 'raw'] = str.length;\n\t\t// Compress to record the gzipped size\n\t\tvar gzippedbuf = yield zlib.gzip(str);\n\t\tprofile.size[pre + 'gzip'] = gzippedbuf.length;\n\t}\n\treturn body;\n});\n\nfunction genLineLengths(str) {\n\treturn str.split(/^/m).map(function(l) {\n\t\treturn l.length;\n\t});\n}\n\nvar roundTripDiff = 
Promise.async(function *(profile, parsoidOptions, data) {\n\tvar normOpts = { preDiff: true, newlines: true };\n\n\t// Newline normalization to see if we can get to identical wt.\n\tvar wt1 = normalizeWikitext(data.oldWt, normOpts);\n\tvar wt2 = normalizeWikitext(data.newWt, normOpts);\n\tdata.oldLineLengths = genLineLengths(data.oldWt);\n\tdata.newLineLengths = genLineLengths(data.newWt);\n\tif (wt1 === wt2) {\n\t\treturn genSyntacticDiffs(data);\n\t}\n\n\t// More conservative normalization this time around\n\tnormOpts.newlines = false;\n\tvar diff = Diff.diffLines(normalizeWikitext(data.oldWt, normOpts), normalizeWikitext(data.newWt, normOpts));\n\tvar offsets = Diff.convertDiffToOffsetPairs(diff, data.oldLineLengths, data.newLineLengths);\n\tif (!offsets.length) {\n\t\t// FIXME: Can this really happen??\n\t\treturn genSyntacticDiffs(data);\n\t}\n\n\tvar contentmodel = data.contentmodel || 'wikitext';\n\tvar options = Object.assign({\n\t\twt2html: true,\n\t\tdata: { wikitext: data.newWt, contentmodel: contentmodel },\n\t}, parsoidOptions);\n\tvar body = yield parsoidPost(profile, options);\n\tdata.newHTML = body.html;\n\tdata.newDp = body['data-parsoid'];\n\tdata.newMw = body['data-mw'];\n\treturn checkIfSignificant(offsets, data);\n});\n\n// Returns a Promise for an object containing a formatted string and an\n// exitCode.\nvar runTests = Promise.async(function *(title, options, formatter) {\n\t// Only support lookups for WMF domains.  
At some point we should rid\n\t// ourselves of prefixes in this file entirely, but that'll take some\n\t// coordination in rt.\n\tvar parsoidConfig = new ParsoidConfig(null, { loadWMF: true });\n\n\tvar domain = options.domain;\n\tvar prefix = options.prefix;\n\n\t// Preserve the default, but only if neither was provided.\n\tif (!prefix && !domain) { domain = 'en.wikipedia.org'; }\n\n\tif (domain && prefix) {\n\t\t// All good.\n\t} else if (!domain && prefix) {\n\t\t// Get the domain from the mw api map.\n\t\tif (parsoidConfig.mwApiMap.has(prefix)) {\n\t\t\tdomain = parsoidConfig.mwApiMap.get(prefix).domain;\n\t\t} else {\n\t\t\tthrow new Error('Couldn\\'t find the domain for prefix: ' + prefix);\n\t\t}\n\t} else if (!prefix && domain) {\n\t\t// Get the prefix from the reverse mw api map.\n\t\tprefix = parsoidConfig.getPrefixFor(domain);\n\t\tif (!prefix) {\n\t\t\t// Bogus, but `prefix` is only used for reporting.\n\t\t\tprefix = domain;\n\t\t}\n\t} else {\n\t\t// Should be unreachable.\n\t\tthrow new Error('No domain or prefix provided.');\n\t}\n\n\tconst uriOpts = options.parsoidURLOpts;\n\tlet uri = uriOpts.baseUrl;\n\tlet proxy;\n\tif (uriOpts.proxy) {\n\t\tproxy = uriOpts.proxy.host;\n\t\tif (uriOpts.proxy.port) {\n\t\t\tproxy += \":\" + uriOpts.proxy.port;\n\t\t}\n\t\t// Special support for the WMF cluster\n\t\turi = uri.replace(/DOMAIN/, domain);\n\t}\n\n\t// make sure the Parsoid URI ends on /\n\tif (!/\\/$/.test(uri)) {\n\t\turi += '/';\n\t}\n\tvar parsoidOptions = {\n\t\turi: uri + domain + '/v3/',\n\t\tproxy: proxy,\n\t\ttitle: encodeURIComponent(title),\n\t\toutputContentVersion: options.outputContentVersion || defaultContentVersion,\n\t};\n\tvar uri2 = parsoidOptions.uri + 'page/wikitext/' + parsoidOptions.title;\n\tif (options.oldid) {\n\t\turi2 += '/' + options.oldid;\n\t}\n\n\tvar profile = { time: { total: 0, start: 0 }, size: {} };\n\tvar data = {};\n\tvar error;\n\tvar exitCode;\n\ttry {\n\t\tvar opts;\n\t\tvar req = yield 
ScriptUtils.retryingHTTPRequest(10, {\n\t\t\tmethod: 'GET',\n\t\t\turi: uri2,\n\t\t\tproxy: proxy,\n\t\t\theaders: {\n\t\t\t\t'User-Agent': UA,\n\t\t\t},\n\t\t});\n\t\tprofile.time.start = JSUtils.startTime();\n\t\t// We may have been redirected to the latest revision.  Record the\n\t\t// oldid for later use in selser.\n\t\tdata.oldid = req[0].request.path.replace(/^(.*)\\//, '');\n\t\tdata.oldWt = req[1];\n\t\tdata.contentmodel = req[0].headers['x-contentmodel'] || 'wikitext';\n\t\t// First, fetch the HTML for the requested page's wikitext\n\t\topts = Object.assign({\n\t\t\twt2html: true,\n\t\t\trecordSizes: true,\n\t\t\tdata: { wikitext: data.oldWt, contentmodel: data.contentmodel },\n\t\t}, parsoidOptions);\n\t\tvar body = yield parsoidPost(profile, opts);\n\n\t\t// Check for wikitext redirects\n\t\tconst redirectMatch = body.html.body.match(/<link rel=\"mw:PageProp\\/redirect\" href=\"([^\"]*)\"/);\n\t\tif (redirectMatch) {\n\t\t\tconst target = Util.decodeURIComponent(entities.decodeHTML5(redirectMatch[1].replace(/^(\\.\\/)?/, '')));\n\t\t\t// Log this so we can collect these and update the database titles\n\t\t\tconsole.error(`REDIRECT: ${prefix}:${title.replace(/\"/g, '\\\\\"')} -> ${prefix}:${target.replace(/\"/g, '\\\\\"')}`);\n\t\t\treturn yield runTests(target, options, formatter);\n\t\t}\n\n\t\tdata.oldHTML = body.html;\n\t\tdata.oldDp = body['data-parsoid'];\n\t\tdata.oldMw = body['data-mw'];\n\t\t// Now, request the wikitext for the obtained HTML\n\t\topts = Object.assign({\n\t\t\thtml2wt: true,\n\t\t\trecordSizes: true,\n\t\t\tdata: {\n\t\t\t\thtml: data.oldHTML.body,\n\t\t\t\tcontentmodel: data.contentmodel,\n\t\t\t\toriginal: {\n\t\t\t\t\t'data-parsoid': data.oldDp,\n\t\t\t\t\t'data-mw': data.oldMw,\n\t\t\t\t\twikitext: { body: data.oldWt },\n\t\t\t\t},\n\t\t\t},\n\t\t}, parsoidOptions);\n\t\tdata.newWt = yield parsoidPost(profile, opts);\n\t\tdata.diffs = yield roundTripDiff(profile, parsoidOptions, data);\n\t\t// Once we have the diffs between 
the round-tripped wt,\n\t\t// to test rt selser we need to modify the HTML and request\n\t\t// the wt again to compare with selser, and then concat the\n\t\t// resulting diffs to the ones we got from basic rt\n\t\tvar newDocument = DOMUtils.parseHTML(data.oldHTML.body);\n\t\tvar newNode = newDocument.createComment('rtSelserEditTestComment');\n\t\tnewDocument.body.appendChild(newNode);\n\t\topts = Object.assign({\n\t\t\thtml2wt: true,\n\t\t\tuseSelser: true,\n\t\t\toldid: data.oldid,\n\t\t\tdata: {\n\t\t\t\thtml: newDocument.outerHTML,\n\t\t\t\tcontentmodel: data.contentmodel,\n\t\t\t\toriginal: {\n\t\t\t\t\t'data-parsoid': data.oldDp,\n\t\t\t\t\t'data-mw': data.oldMw,\n\t\t\t\t\twikitext: { body: data.oldWt },\n\t\t\t\t\thtml: data.oldHTML,\n\t\t\t\t},\n\t\t\t},\n\t\t\tprofilePrefix: 'selser',\n\t\t}, parsoidOptions);\n\t\tvar out = yield parsoidPost(profile, opts);\n\t\t// Finish the total time now\n\t\t// FIXME: Is this the right place to end it?\n\t\tprofile.time.total = JSUtils.elapsedTime(profile.time.start);\n\t\t// Remove the selser trigger comment\n\t\tdata.newWt = out.replace(/<!--rtSelserEditTestComment-->\\n*$/, '');\n\t\tvar selserDiffs = yield roundTripDiff(profile, parsoidOptions, data);\n\t\tselserDiffs.forEach(function(diff) {\n\t\t\tdiff.selser = true;\n\t\t});\n\t\tif (selserDiffs.length) {\n\t\t\tdata.diffs = data.diffs.concat(selserDiffs);\n\t\t\texitCode = 1;\n\t\t} else {\n\t\t\texitCode = 0;\n\t\t}\n\t} catch (e) {\n\t\terror = e;\n\t\texitCode = 1;\n\t}\n\tvar output = formatter(error, prefix, title, data.diffs, profile);\n\t// write diffs to $outDir/DOMAIN/TITLE\n\tif (options.htmlDiffConfig && Math.random() < (options.htmlDiffConfig.sampleRate || 0)) {\n\t\tconst outDir = options.htmlDiffConfig.outDir || \"/tmp/htmldiffs\";\n\t\tconst dir = `${outDir}/${domain}`;\n\t\tif (!fs.existsSync(dir)) {\n\t\t\tfs.mkdirSync(dir);\n\t\t}\n\t\tconst diffs = yield htmlDiff(options.htmlDiffConfig, domain, title);\n\t\t// parsoidOptions.title is 
uri-encoded\n\t\tfs.writeFileSync(`${dir}/${parsoidOptions.title}`, diffs.join('\\n'));\n\t}\n\treturn {\n\t\toutput: output,\n\t\texitCode: exitCode\n\t};\n});\n\n\nif (require.main === module) {\n\tvar standardOpts = {\n\t\txml: {\n\t\t\tdescription: 'Use xml callback',\n\t\t\tboolean: true,\n\t\t\tdefault: false,\n\t\t},\n\t\tprefix: {\n\t\t\tdescription: 'Deprecated.  Please provide a domain.',\n\t\t\tboolean: false,\n\t\t\tdefault: '',\n\t\t},\n\t\tdomain: {\n\t\t\tdescription: 'Which wiki to use; e.g. \"en.wikipedia.org\" for' +\n\t\t\t\t' English wikipedia',\n\t\t\tboolean: false,\n\t\t\tdefault: '',  // Add a default when `prefix` is removed.\n\t\t},\n\t\toldid: {\n\t\t\tdescription: 'Optional oldid of the given page. If not given,' +\n\t\t\t\t' will use the latest revision.',\n\t\t\tboolean: false,\n\t\t\tdefault: null,\n\t\t},\n\t\tparsoidURL: {\n\t\t\tdescription: 'The URL for the Parsoid API',\n\t\t\tboolean: false,\n\t\t\tdefault: '',\n\t\t},\n\t\tproxyURL: {\n\t\t\tdescription: 'URL (with protocol and port, if any) for the proxy fronting Parsoid',\n\t\t\tboolean: false,\n\t\t\tdefault: null,\n\t\t},\n\t\tapiURL: {\n\t\t\tdescription: 'http path to remote API,' +\n\t\t\t\t' e.g. http://en.wikipedia.org/w/api.php',\n\t\t\tboolean: false,\n\t\t\tdefault: '',\n\t\t},\n\t\toutputContentVersion: {\n\t\t\tdescription: 'The acceptable content version.',\n\t\t\tboolean: false,\n\t\t\tdefault: defaultContentVersion,\n\t\t},\n\t\tcheck: {\n\t\t\tdescription: 'Exit with non-zero exit code if differences found using selser',\n\t\t\tboolean: true,\n\t\t\tdefault: false,\n\t\t\talias: 'c',\n\t\t},\n\t};\n\n\tPromise.async(function *() {\n\t\tvar opts = yargs\n\t\t.usage(\n\t\t\t'Usage: $0 [options] <page-title> \\n' +\n\t\t\t'The page title should be the \"true title\",' +\n\t\t\t'i.e., without any url encoding which might be necessary if it appeared in wikitext.' 
+\n\t\t\t'\\n\\n'\n\t\t)\n\t\t.options(standardOpts)\n\t\t.strict();\n\n\t\tvar argv = opts.argv;\n\t\tif (!argv._.length) {\n\t\t\treturn opts.showHelp();\n\t\t}\n\t\tvar title = String(argv._[0]);\n\n\t\tvar ret = null;\n\t\tif (!argv.parsoidURL) {\n\t\t\t// Start our own Parsoid server\n\t\t\tvar serviceWrapper = require('../tests/serviceWrapper.js');\n\t\t\tvar serverOpts = {\n\t\t\t\tlogging: { level: 'info' },\n\t\t\t\tparsoidOptions: {\n\t\t\t\t\tloadWMF: true,\n\t\t\t\t\tuseSelser: true,\n\t\t\t\t}\n\t\t\t};\n\t\t\tif (argv.apiURL) {\n\t\t\t\tserverOpts.mockURL = argv.apiURL;\n\t\t\t\targv.domain = 'customwiki';\n\t\t\t} else {\n\t\t\t\tserverOpts.skipMock = true;\n\t\t\t}\n\t\t\tret = yield serviceWrapper.runServices(serverOpts);\n\t\t\targv.parsoidURL = ret.parsoidURL;\n\t\t}\n\t\targv.parsoidURLOpts = { baseUrl: argv.parsoidURL };\n\t\tif (argv.proxyURL) {\n\t\t\targv.parsoidURLOpts.proxy = { host: argv.proxyURL };\n\t\t}\n\t\tvar formatter = ScriptUtils.booleanOption(argv.xml) ? 
xmlFormat : plainFormat;\n\t\tvar r = yield runTests(title, argv, formatter);\n\t\tconsole.log(r.output);\n\t\tif (ret !== null) {\n\t\t\tyield ret.runner.stop();\n\t\t}\n\t\tif (argv.check) {\n\t\t\tprocess.exit(r.exitCode);\n\t\t}\n\t})().done();\n} else if (typeof module === 'object') {\n\tmodule.exports.runTests = runTests;\n\n\tmodule.exports.jsonFormat = jsonFormat;\n\tmodule.exports.plainFormat = plainFormat;\n\tmodule.exports.xmlFormat = xmlFormat;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/composer.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/core-upgrade.js","messages":[{"ruleId":"no-shadow","severity":2,"message":"'msg' is already declared in the upper scope.","line":34,"column":17,"nodeType":"Identifier","messageId":"noShadow","endLine":34,"endColumn":20}],"errorCount":1,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"'use strict';\n\n// Register prfun's Promises with node-pn\nvar Promise = require('./lib/utils/promise.js');\nrequire('pn/_promise')(Promise); // This only needs to be done once.\n\n// Comments below annotate the highest lts version of node for which the\n// polyfills are necessary.  Remove when that version is no longer supported.\n\n// v6\nrequire('core-js/fn/object/entries');\nrequire('core-js/fn/string/pad-start');\nrequire('core-js/fn/string/pad-end');\n\n// In Node v10, console.assert() was changed to log messages to stderr\n// *WITHOUT THROWING AN EXCEPTION*.  We should clearly have been using\n// a proper assertion library... 
but since we're switching to PHP anyway,\n// for the moment just hack console.assert() to make things behave the\n// way they used to.\nif (require('semver').gte(process.version, '10.0.0')) {\n\tconst oldAssert = console.assert;\n\tconsole.assert = function(value) {\n\t\tconst args = Array.from(arguments);\n\t\toldAssert.apply(console, args);\n\t\tif (!args[0]) {\n\t\t\t// We only get here in Node >= 10!\n\t\t\targs.shift();\n\t\t\tlet msg = 'AssertionError';\n\t\t\tif (args.length) {\n\t\t\t\tconst util = require('util');\n\t\t\t\tmsg += ': ' + util.format.apply(util, args);\n\t\t\t}\n\t\t\tclass AssertionException extends Error {\n\t\t\t\tconstructor(msg) { super(msg); this.message = msg; }\n\t\t\t}\n\t\t\tthrow new AssertionException(msg);\n\t\t}\n\t};\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/extension.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/extension/restRoutes.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/config/ParsoidConfig.js","messages":[{"ruleId":"node/no-deprecated-api","severity":2,"message":"'url.parse' was deprecated since v11.0.0. 
Use 'url.URL' constructor instead.","line":522,"column":20,"nodeType":"MemberExpression","endLine":522,"endColumn":29}],"errorCount":1,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * Parsoid-specific configuration.\n * This is immutable after initialization.\n *\n * @module\n */\n\n'use strict';\n\nrequire('../../core-upgrade.js');\n\nvar fs = require('fs');\nvar path = require('path');\nvar url = require('url');\nvar ServiceRunner = require('service-runner');\n\nvar Util = require('../utils/Util.js').Util;\nvar JSUtils = require('../utils/jsutils.js').JSUtils;\nvar wmfSiteMatrix = require('./wmf.sitematrix.json').sitematrix;\n\n/*\n * @property {Object} CONFIG_DEFAULTS\n *   Timeout values for various things. All values in ms.\n * @private\n */\nvar CONFIG_DEFAULTS = Object.freeze({\n\ttimeouts: {\n\t\t// How long does a request have to generate a response?\n\t\trequest: 4 * 60 * 1000,\n\n\t\t// These are timeouts for different api endpoints to the mediawiki API\n\t\tmwApi: {\n\t\t\t// action=expandtemplates\n\t\t\tpreprocessor: 30 * 1000,\n\t\t\t// action=parse\n\t\t\textParse: 30 * 1000,\n\t\t\t// action=templateData\n\t\t\ttemplateData: 30 * 1000,\n\t\t\t// action=parsoid-batch\n\t\t\tbatch: 60 * 1000,\n\t\t\t// action=query&prop=revisions\n\t\t\tsrcFetch: 40 * 1000,\n\t\t\t// action=query&prop=imageinfo\n\t\t\timgInfo: 40 * 1000,\n\t\t\t// action=query&meta=siteinfo\n\t\t\tconfigInfo: 40 * 1000,\n\t\t\t// action=record-lint\n\t\t\tlint: 30 * 1000,\n\t\t\t// Connection timeout setting for the http agent\n\t\t\tconnect: 5 * 1000,\n\t\t},\n\t},\n\n\t// Max concurrency level for accessing the Mediawiki API\n\tmaxSockets: 15,\n\n\t// Multiple of cpu_workers number requests to queue before rejecting\n\tmaxConcurrentCalls: 5,\n\n\tretries: {\n\t\tmwApi: {\n\t\t\tall: 1,\n\t\t\t// No retrying config requests\n\t\t\t// FIXME: but why? 
seems like 1 retry is not a bad idea\n\t\t\tconfigInfo: 0,\n\t\t},\n\t},\n\n\t// Somewhat arbitrary numbers for starters.\n\t// If these limits are breached, we return a http 413 (Payload too large)\n\tlimits: {\n\t\twt2html: {\n\t\t\t// We won't handle pages beyond this size\n\t\t\tmaxWikitextSize: 1000000, // 1M\n\n\t\t\t// Max list items per page\n\t\t\tmaxListItems: 30000,\n\n\t\t\t// Max table cells per page\n\t\t\tmaxTableCells: 30000,\n\n\t\t\t// Max transclusions per page\n\t\t\tmaxTransclusions: 10000,\n\n\t\t\t// DISABLED for now\n\t\t\t// Max images per page\n\t\t\tmaxImages: 1000,\n\n\t\t\t// Max top-level token size\n\t\t\tmaxTokens: 1000000, // 1M\n\t\t},\n\t\thtml2wt: {\n\t\t\t// We refuse to serialize HTML strings bigger than this\n\t\t\tmaxHTMLSize: 10000000,  // 10M\n\t\t},\n\t},\n\n\tlinter: {\n\t\t// Whether to send lint errors to the MW API\n\t\t// Requires the MW Linter extension to be installed and configured.\n\t\tsendAPI: false,\n\n\t\t// Ratio at which to sample linter errors, per page.\n\t\t// This is deterministic and based on page_id.\n\t\tapiSampling: 1,\n\n\t\t// Max length of content covered by 'white-space:nowrap' CSS\n\t\t// that we consider \"safe\" when Tidy is replaced. Beyond that,\n\t\t// wikitext will have to be fixed up to manually insert whitespace\n\t\t// at the right places.\n\t\ttidyWhitespaceBugMaxLength: 100,\n\t},\n});\n\nvar prepareLog = function(logData) {\n\tvar log = Object.assign({ logType: logData.logType }, logData.locationData);\n\tvar flat = logData.flatLogObject();\n\tObject.keys(flat).forEach(function(k) {\n\t\t// Be sure we don't have a `type` field here since logstash\n\t\t// treats that as magical.  
We created a special `location`\n\t\t// field above and bunyan will add a `level` field (from the\n\t\t// contents of our `type` field) when we call the specific\n\t\t// logger returned by `_getBunyanLogger`.\n\t\tif (/^(type|location|level)$/.test(k)) { return; }\n\t\tlog[k] = flat[k];\n\t});\n\treturn log;\n};\n\n/**\n * Global Parsoid configuration object. Will hold things like debug/trace\n * options, mw api map, and local settings like fetchTemplates.\n *\n * @class\n * @param {Object} localSettings The localSettings object, probably from a localsettings.js file.\n * @param {Function} localSettings.setup The local settings setup function, which sets up our local configuration.\n * @param {ParsoidConfig} localSettings.setup.opts The setup function is passed the object under construction so it can extend the config directly.\n * @param {Object} options Any options we want to set over the defaults. See the class properties for more information.\n */\nfunction ParsoidConfig(localSettings, options) {\n\toptions = options || {};\n\n\tthis.mwApiMap = new Map();\n\tthis.reverseMwApiMap = new Map();\n\tObject.keys(CONFIG_DEFAULTS).forEach(function(k) {\n\t\tthis[k] = Util.clone(CONFIG_DEFAULTS[k]);\n\t}, this);\n\tthis._uniq = 0;\n\n\t// Don't freak out!\n\t// This happily overwrites inherited properties.\n\tObject.assign(this, options);\n\t// Trace, debug, and dump flags should be sets, but options might\n\t// include them as arrays.\n\t['traceFlags', 'debugFlags', 'dumpFlags'].forEach(function(f) {\n\t\tif (Array.isArray(this[f])) {\n\t\t\tthis[f] = new Set(this[f]);\n\t\t}\n\t}, this);\n\n\tif (options.parent && (!this.loggerBackend || !this.metrics)) {\n\t\tvar srlogger = ServiceRunner.getLogger(options.parent.logging);\n\t\tif (!this.loggerBackend) {\n\t\t\tthis.loggerBackend = function(logData, cb) {\n\t\t\t\tsrlogger.log(logData.logType, prepareLog(logData));\n\t\t\t\tcb();\n\t\t\t};\n\t\t}\n\t\tif (!this.metrics) {\n\t\t\tthis.metrics = 
ServiceRunner.getMetrics(options.parent.metrics, srlogger);\n\t\t}\n\t}\n\n\tif (!localSettings && options.localsettings) {\n\t\tlocalSettings = require(options.localsettings);\n\t}\n\n\tif (localSettings && localSettings.setup) {\n\t\tlocalSettings.setup(this);\n\t}\n\n\t// Call setMwApi for each specified API.\n\tif (Array.isArray(this.mwApis)) {\n\t\tthis.mwApis.forEach(function(api) {\n\t\t\tthis.setMwApi(api);\n\t\t}, this);\n\t}\n\n\tif (this.loadWMF) {\n\t\tthis.loadWMFApiMap();\n\t}\n\n\t// Make sure all critical required properties are present\n\tthis._sanitizeIt();\n\n\t// ParsoidConfig is used across requests. Freeze it to avoid mutation.\n\tvar ignoreFields = {\n\t\tmetrics: true,\n\t\tloggerBackend: true,\n\t\tmwApiMap: true,\n\t\treverseMwApiMap: true\n\t};\n\tJSUtils.deepFreezeButIgnore(this, ignoreFields);\n}\n\n\n/**\n * @property {boolean} debug Whether to print debugging information.\n */\nParsoidConfig.prototype.debug = false;\n\n/**\n * @property {Set} traceFlags Flags that tell us which tracing information to print.\n */\nParsoidConfig.prototype.traceFlags = null;\n\n/**\n * @property {Set} debugFlags Flags that tell us which debugging information to print.\n */\nParsoidConfig.prototype.debugFlags = null;\n\n/**\n * @property {Set} dumpFlags Flags that tell us what state to dump.\n */\nParsoidConfig.prototype.dumpFlags = null;\n\n/**\n * @property {boolean} fetchTemplates Whether we should request templates from a wiki, or just use cached versions.\n */\nParsoidConfig.prototype.fetchTemplates = true;\n\n/**\n * @property {boolean} expandExtensions Whether we should request extension tag expansions from a wiki.\n */\nParsoidConfig.prototype.expandExtensions = true;\n\n/**\n * @property {number} maxDepth The maximum depth to which we should expand templates. Only applies if we would fetch templates anyway, and if we're actually expanding templates. 
So #fetchTemplates must be true and #usePHPPreProcessor must be false.\n */\nParsoidConfig.prototype.maxDepth = 40;\n\n/**\n * @property {boolean} usePHPPreProcessor Whether we should use the PHP Preprocessor to expand templates, extension content, and the like. See #PHPPreProcessorRequest in lib/mediawiki.ApiRequest.js\n */\nParsoidConfig.prototype.usePHPPreProcessor = true;\n\n/**\n * @property {string} defaultWiki The wiki we should use for template, page, and configuration requests. We set this as a default because a configuration file (e.g. the API service's config.yaml) might set this, but we will still use the appropriate wiki when requests come in for a different prefix.\n */\nParsoidConfig.prototype.defaultWiki = 'enwiki';\n\n/**\n * @property {string} allowCORS Permissive CORS headers as Parsoid is fully idempotent currently\n */\nParsoidConfig.prototype.allowCORS = '*';\n\n/**\n * @property {boolean} useSelser Whether to use selective serialization when serializing a DOM to Wikitext. This amounts to not serializing bits of the page that aren't marked as having changed, and requires some way of getting the original text of the page. 
See #SelectiveSerializer in lib/mediawiki.SelectiveSerializer.js\n */\nParsoidConfig.prototype.useSelser = false;\n\n/**\n * @property {boolean} fetchConfig\n *   Whether to fetch the wiki config from the server or use our local copy.\n */\nParsoidConfig.prototype.fetchConfig = true;\n\n/**\n * @property {boolean} fetchImageInfo\n *   Whether to fetch image info via the API or else treat all images as missing.\n */\nParsoidConfig.prototype.fetchImageInfo = true;\n\n/**\n * @property {boolean} rtTestMode\n *   Test in rt test mode (changes some parse & serialization strategies)\n */\nParsoidConfig.prototype.rtTestMode = false;\n\n/**\n * @property {boolean} addHTMLTemplateParameters\n * When processing template parameters, parse them to HTML and add it to the\n * template parameters data.\n */\nParsoidConfig.prototype.addHTMLTemplateParameters = false;\n\n/**\n * @property {boolean|Array} linting Whether to enable linter Backend.\n * Or an array of enabled lint types\n */\nParsoidConfig.prototype.linting = false;\n\n/**\n * @property {Function} loggerBackend\n * The logger output function.\n * By default, use stderr to output logs.\n */\nParsoidConfig.prototype.loggerBackend = null;\n\n/**\n * @property {Array|null} loggerSampling\n * An array of arrays of log types and sample rates, in percent.\n * Omissions imply 100.\n * For example,\n *   parsoidConfig.loggerSampling = [\n *     ['warn/dsr/inconsistent', 1],\n *   ];\n */\nParsoidConfig.prototype.loggerSampling = null;\n\n/**\n * @property {Function} tracerBackend\n * The tracer output function.\n * By default, use stderr to output traces.\n */\nParsoidConfig.prototype.tracerBackend = null;\n\n/**\n * @property {boolean} strictSSL\n * By default require SSL certificates to be valid\n * Set to false when using self-signed SSL certificates\n */\nParsoidConfig.prototype.strictSSL = true;\n\n/**\n * The default api proxy, overridden by apiConf.proxy entries.\n */\nParsoidConfig.prototype.defaultAPIProxyURI = 
undefined;\n\n/**\n * Server to connect to for MediaWiki API requests.\n */\nParsoidConfig.prototype.mwApiServer = undefined;\n\n/**\n * The server from which to load style modules.\n */\nParsoidConfig.prototype.modulesLoadURI = undefined;\n\n/**\n * Load WMF sites in the interwikiMap from the cached wmf.sitematrix.json\n */\nParsoidConfig.prototype.loadWMF = false;\n\n/**\n * Set to true to use the Parsoid-specific batch API from the ParsoidBatchAPI\n * extension (action=parsoid-batch).\n */\nParsoidConfig.prototype.useBatchAPI = false;\n\n/**\n * The batch size for parse/preprocess requests\n */\nParsoidConfig.prototype.batchSize = 50;\n\n/**\n * The maximum number of concurrent requests that the API request batcher will\n * allow to be active at any given time. Before this limit is reached, requests\n * will be dispatched more aggressively, giving smaller batches on average.\n * After the limit is reached, batches will be stored in a queue with\n * APIBatchSize items in each batch.\n */\nParsoidConfig.prototype.batchConcurrency = 4;\n\n/**\n * @property {Object|null} Statistics aggregator, for counting and timing.\n */\nParsoidConfig.prototype.metrics = null;\n\n/**\n * @property {string} Default user agent used for making Mediawiki API requests\n */\nParsoidConfig.prototype.userAgent = \"Parsoid/\" + (require('../../package.json').version);\n\n/**\n * @property {number} Number of outstanding event listeners waiting on Mediawiki API responses\n */\nParsoidConfig.prototype.maxListeners = 50000;\n\n/**\n * @property {number} Form size limit in bytes (default is 2M in express)\n */\nParsoidConfig.prototype.maxFormSize = 15 * 1024 * 1024;\n\n/**\n * Log warnings from the Mediawiki Api.\n */\nParsoidConfig.prototype.logMwApiWarnings = true;\n\n/**\n * Suppress some warnings by default.\n */\nParsoidConfig.prototype.suppressMwApiWarnings = /modulemessages is deprecated|Unrecognized parameter: variant/;\n\n/**\n * If enabled, bidi chars adjacent to category links will 
be stripped\n * in the html -> wt serialization pass.\n */\nParsoidConfig.prototype.scrubBidiChars = false;\n\n/**\n * @property {number} How often should we emit a heap sample? Time in ms.\n *\n * Only relevant if performance timing is enabled\n */\nParsoidConfig.prototype.heapUsageSampleInterval = 5 * 60 * 1000;\n\n/**\n * @property {Function | null} Allow dynamic configuration of unknown domains.\n *\n * See T100841.\n */\nParsoidConfig.prototype.dynamicConfig = null;\n\n/**\n * Initialize the mwApiMap and friends.\n */\nParsoidConfig.prototype.loadWMFApiMap = function() {\n\tvar insertInMaps = (site) => {\n\t\t// Don't use the default proxy for restricted sites.\n\t\t// private: Restricted read and write access.\n\t\t// fishbowl: Restricted write access, full read access.\n\t\t// closed: No write access.\n\t\t// nonglobal: Public but requires registration.\n\t\tconst restricted = site.hasOwnProperty(\"private\") ||\n\t\t\tsite.hasOwnProperty(\"fishbowl\") ||\n\t\t\tsite.hasOwnProperty(\"nonglobal\");\n\n\t\t// Avoid overwriting those already set in localsettings setup.\n\t\tif (!this.mwApiMap.has(site.dbname)) {\n\t\t\tvar apiConf = {\n\t\t\t\tprefix: site.dbname,\n\t\t\t\turi: site.url + \"/w/api.php\",\n\t\t\t\tproxy: {\n\t\t\t\t\turi: restricted ? 
null : undefined,\n\t\t\t\t\t// WMF production servers don't listen on port 443.\n\t\t\t\t\t// see mediawiki.ApiRequest for handling of this option.\n\t\t\t\t\tstrip_https: true,\n\t\t\t\t},\n\t\t\t\tnonglobal: site.hasOwnProperty(\"nonglobal\"),\n\t\t\t\trestricted,\n\t\t\t};\n\t\t\tthis.setMwApi(apiConf);\n\t\t}\n\t};\n\n\t// See getAPIProxy for the meaning of null / undefined in setMwApi.\n\n\tObject.keys(wmfSiteMatrix).forEach((key) => {\n\t\tvar val = wmfSiteMatrix[key];\n\t\tif (!Number.isNaN(Number(key))) {\n\t\t\tval.site.forEach(insertInMaps);\n\t\t} else if (key === \"specials\") {\n\t\t\tval.forEach(insertInMaps);\n\t\t}\n\t});\n};\n\n/**\n * Set up a wiki configuration.\n *\n * For backward compatibility, if there are two arguments the first is\n * taken as a prefix and the second as the configuration, and if\n * the configuration is a string it is used as the `uri` property\n * in a new empty configuration object.  This usage is deprecated;\n * we recommend users pass a configuration object as documented below.\n *\n * @param {Object} apiConf\n *   The wiki configuration object.\n * @param {string} apiConf.uri\n *   The URL to the wiki's Action API (`api.php`).\n *   This is the only mandatory argument.\n * @param {string} [apiConf.domain]\n *   The \"domain\" used to identify this wiki when using the Parsoid v2/v3 API.\n *   It defaults to the hostname portion of `apiConf.uri`.\n * @param {string} [apiConf.prefix]\n *   An arbitrary unique identifier for this wiki.  If none is provided\n *   a unique string will be generated.\n * @param {Object} [apiConf.proxy]\n *   A proxy configuration object.\n * @param {string|null} [apiConf.proxy.uri]\n *   The URL of a proxy to use for API requests, or null to explicitly\n *   disable API request proxying for this wiki. 
Will fall back to\n *   {@link ParsoidConfig#defaultAPIProxyURI} if `undefined` (default value).\n * @param {Object} [apiConf.proxy.headers]\n *   Headers to add when proxying.\n * @param {Array} [apiConf.extensions]\n *   A list of native extension constructors.  Otherwise, registers cite by\n *   default.\n * @param {boolean} [apiConf.strictSSL]\n */\nParsoidConfig.prototype.setMwApi = function(apiConf) {\n\tvar prefix;\n\t// Backward-compatibility with old calling conventions.\n\tif (typeof arguments[0] === 'string') {\n\t\tconsole.warn(\n\t\t\t'String arguments to ParsoidConfig#setMwApi are deprecated:',\n\t\t\targuments[0]\n\t\t);\n\t\tif (typeof arguments[1] === 'string') {\n\t\t\tapiConf = { prefix: arguments[0], uri: arguments[1] };\n\t\t} else if (typeof arguments[1] === 'object') {\n\t\t\t// Note that `apiConf` is aliased to `arguments[0]`.\n\t\t\tprefix = arguments[0];\n\t\t\tapiConf = Object.assign({}, arguments[1]);  // overwrites `arguments[0]`\n\t\t\tapiConf.prefix = prefix;\n\t\t} else {\n\t\t\tapiConf = { uri: arguments[0] };\n\t\t}\n\t} else {\n\t\tconsole.assert(typeof apiConf === 'object');\n\t\tapiConf = Object.assign({}, apiConf);  // Don't modify the passed in object\n\t}\n\tconsole.assert(apiConf.uri, \"Action API uri is mandatory.\");\n\tif (!apiConf.prefix) {\n\t\t// Pick a unique prefix.\n\t\tdo {\n\t\t\tapiConf.prefix = 'wiki$' + (this._uniq++);\n\t\t} while (this.mwApiMap.has(apiConf.prefix));\n\t}\n\tif (!apiConf.domain) {\n\t\tapiConf.domain = url.parse(apiConf.uri).host;\n\t}\n\tprefix = apiConf.prefix;\n\n\t// Give them some default extensions.\n\tif (!Array.isArray(apiConf.extensions)) {\n\t\t// Native support for certain extensions (Cite, etc)\n\t\t// Note that in order to remain compatible with mediawiki core,\n\t\t// core extensions (for example, for the JSON content model)\n\t\t// must take precedence over other extensions.\n\t\tapiConf.extensions = Util.clone(this.defaultNativeExtensions);\n\t\t/* Include global user 
extensions */\n\t\tParsoidConfig._collectExtensions(\n\t\t\tapiConf.extensions\n\t\t);\n\t\t/* Include wiki-specific user extensions */\n\t\t// User can specify an alternate directory here, so they can point\n\t\t// directly at their mediawiki core install if they wish.\n\t\tParsoidConfig._collectExtensions(\n\t\t\tapiConf.extensions, apiConf.extdir || apiConf.domain\n\t\t);\n\t}\n\n\tif (this.reverseMwApiMap.has(apiConf.domain)) {\n\t\tconsole.warn(\n\t\t\t\"Domain should be unique in ParsoidConfig#setMwApi calls:\",\n\t\t\tapiConf.domain\n\t\t);\n\t\tconsole.warn(\n\t\t\t\"(It doesn't have to be an actual domain, just a unique string.)\"\n\t\t);\n\t}\n\tif (this.mwApiMap.has(prefix)) {\n\t\tconsole.warn(\n\t\t\t\"Prefix should be unique in ParsoidConfig#setMwApi calls:\",\n\t\t\tprefix\n\t\t);\n\t\tthis.reverseMwApiMap.delete(this.mwApiMap.get(prefix).domain);\n\t}\n\tthis.mwApiMap.set(prefix, apiConf);\n\tthis.reverseMwApiMap.set(apiConf.domain, prefix);\n};\n\n/**\n * Remove an wiki configuration.\n *\n * @param {Object} apiConf\n *   A wiki configuration object.  The value of `apiConf.domain`, or if\n *   that is missing `apiConf.prefix`, will be used to locate the\n *   configuration to remove.  
Deprecated: if a string is passed, it\n *   is used as the prefix to remove.\n */\nParsoidConfig.prototype.removeMwApi = function(apiConf) {\n\tvar prefix, domain;\n\tif (typeof apiConf === 'string') {\n\t\tconsole.warn(\n\t\t\t\"Passing a string to ParsoidConfig#removeMwApi is deprecated:\",\n\t\t\tapiConf\n\t\t);\n\t\tapiConf = { prefix: apiConf };\n\t}\n\tprefix = apiConf.prefix;\n\tdomain = apiConf.domain;\n\tconsole.assert(prefix || domain, \"Must pass either prefix or domain\");\n\tif (domain) {\n\t\tprefix = this.reverseMwApiMap.get(domain);\n\t}\n\tif (!prefix || !this.mwApiMap.has(prefix)) {\n\t\treturn;\n\t}\n\tif (!domain) {\n\t\tdomain = this.mwApiMap.get(prefix).domain;\n\t}\n\tthis.reverseMwApiMap.delete(domain);\n\tthis.mwApiMap.delete(prefix);\n};\n\n/**\n * Return the internal prefix used to index configuration information for\n * the given domain string.  If the prefix is not present, attempts\n * dynamic configuration using the `dynamicConfig` hook before returning.\n *\n * XXX: We should eventually move the dynamic configuration to lookups on\n * the mwApiMap, once we remove `prefix` from our codebase: T206764.\n *\n * @param {string} domain\n * @return {string} Internal prefix\n */\nParsoidConfig.prototype.getPrefixFor = function(domain) {\n\t// Support dynamic configuration\n\tif (!this.reverseMwApiMap.has(domain) && this.dynamicConfig) {\n\t\tthis.dynamicConfig(domain);\n\t}\n\treturn this.reverseMwApiMap.get(domain);\n};\n\n/**\n * Figure out the proxy to use for API requests for a given wiki.\n *\n * @param {string} prefix\n * @return {Object}\n */\nParsoidConfig.prototype.getAPIProxy = function(prefix) {\n\tvar apiProxy = { uri: undefined, headers: undefined };\n\t// Don't update the stored proxy object, otherwise subsequent calls\n\t// with the same prefix may do the wrong thing. (ex. 
null -> undefined ->\n\t// defaultAPIProxyURI)\n\tObject.assign(apiProxy, this.mwApiMap.get(prefix).proxy);\n\tif (apiProxy.uri === null ||\n\t\tthis.mwApiMap.get(prefix).proxy === null) {\n\t\t// Explicitly disable the proxy if null was set for this prefix\n\t\tapiProxy.uri = undefined;\n\t} else if (apiProxy.uri === undefined) {\n\t\t// No specific api proxy set. Fall back to generic API proxy.\n\t\tapiProxy.uri = this.defaultAPIProxyURI;\n\t}\n\treturn apiProxy;\n};\n\n// Collect extensions from a directory.\nParsoidConfig._collectExtensions = function(arr, dir, isNative) {\n\tvar base = path.join(__dirname, '..', '..', 'extensions');\n\tif (dir) { base = path.resolve(base, dir); }\n\ttry {\n\t\tif (!fs.statSync(base).isDirectory()) { return; /* not dir */ }\n\t} catch (e) { return; /* no file there */ }\n\tvar files = fs.readdirSync(base);\n\t// Sort! To ensure that we have a repeatable order in which we load\n\t// and process extensions.\n\tfiles.sort();\n\tfiles.forEach(function(d) {\n\t\tvar p = isNative ? 
path.join(base, d) : path.join(base, d, 'parsoid');\n\t\ttry {\n\t\t\tif (!fs.statSync(p).isDirectory()) { return; /* not dir */ }\n\t\t} catch (e) { return; /* no file there */ }\n\t\t// Make sure that exceptions here are visible to user.\n\t\tarr.push(ParsoidConfig.loadExtension(p));\n\t});\n};\n\nParsoidConfig.loadExtension = function(modulePath) {\n\t// The extension will load the extension API relative to this module.\n\tvar ext = require(modulePath);\n\tconsole.assert(\n\t\ttypeof ext === 'function',\n\t\t\"Extension is not a function when loading \" + modulePath\n\t);\n\treturn ext;\n};\n\n// Useful internal function for testing\nParsoidConfig.prototype._sanitizeIt = function() {\n\tthis.sanitizeConfig(this, CONFIG_DEFAULTS);\n};\n\nParsoidConfig.prototype.sanitizeConfig = function(obj, defaults) {\n\t// Make sure that all critical required values are set and\n\t// that localsettings.js mistakes don't leave holes in the settings.\n\t//\n\t// Ex: parsoidConfig.timeouts = {}\n\n\tObject.keys(defaults).forEach((key) => {\n\t\tif (obj[key] === null || obj[key] === undefined || typeof obj[key] !== typeof defaults[key]) {\n\t\t\tif (obj[key] !== undefined) {\n\t\t\t\tconsole.warn(\"WARNING: For config property \" + key + \", required a value of type: \" + (typeof defaults[key]));\n\t\t\t\tconsole.warn(\"Found \" + JSON.stringify(obj[key]) + \"; Resetting it to: \" + JSON.stringify(defaults[key]));\n\t\t\t}\n\t\t\tobj[key] = Util.clone(defaults[key]);\n\t\t} else if (typeof defaults[key] === 'object') {\n\t\t\tthis.sanitizeConfig(obj[key], defaults[key]);\n\t\t}\n\t});\n};\n\nParsoidConfig.prototype.defaultNativeExtensions = [];\nParsoidConfig._collectExtensions(\n\tParsoidConfig.prototype.defaultNativeExtensions,\n\tpath.resolve(__dirname, '../ext'),\n\ttrue /* don't require a 'parsoid' subdirectory */\n);\n\n/**\n * @property {boolean} Expose development routes in the HTTP API.\n */\nParsoidConfig.prototype.devAPI = false;\n\n/**\n * @property {boolean} Enable 
editing galleries via HTML, instead of extsrc.\n */\nParsoidConfig.prototype.nativeGallery = true;\n\nif (typeof module === \"object\") {\n\tmodule.exports.ParsoidConfig = ParsoidConfig;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/config/WikitextConstants.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/config/wmf.sitematrix.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/html2wt/DOMNormalizer.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":53,"column":1,"nodeType":"Block","endLine":60,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" type.","line":56,"column":null,"nodeType":"Block","endLine":56,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":57,"column":null,"nodeType":"Block","endLine":57,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"a\"","line":58,"column":null,"nodeType":"Block","endLine":58,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" type.","line":58,"column":null,"nodeType":"Block","endLine":58,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" 
type.","line":59,"column":null,"nodeType":"Block","endLine":59,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":65,"column":1,"nodeType":"Block","endLine":75,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" type.","line":71,"column":null,"nodeType":"Block","endLine":71,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":72,"column":null,"nodeType":"Block","endLine":72,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"a\"","line":73,"column":null,"nodeType":"Block","endLine":73,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" type.","line":73,"column":null,"nodeType":"Block","endLine":73,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":74,"column":null,"nodeType":"Block","endLine":74,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'SerializerState' is undefined.","line":145,"column":null,"nodeType":"Block","endLine":145,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":191,"column":2,"nodeType":"Block","endLine":198,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" type.","line":194,"column":null,"nodeType":"Block","endLine":194,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":195,"column":null,"nodeType":"Block","endLine":195,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" 
type.","line":196,"column":null,"nodeType":"Block","endLine":196,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"a\"","line":196,"column":null,"nodeType":"Block","endLine":196,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":197,"column":null,"nodeType":"Block","endLine":197,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":237,"column":2,"nodeType":"Block","endLine":242,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" type.","line":240,"column":null,"nodeType":"Block","endLine":240,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":241,"column":null,"nodeType":"Block","endLine":241,"endColumn":null},{"ruleId":"no-shadow","severity":2,"message":"'firstChild' is already declared in the upper scope.","line":413,"column":7,"nodeType":"Identifier","messageId":"noShadow","endLine":413,"endColumn":17},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":467,"column":null,"nodeType":"Block","endLine":467,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":468,"column":null,"nodeType":"Block","endLine":468,"endColumn":null}],"errorCount":1,"warningCount":24,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * DOM normalization.\n *\n * DOM normalizations are performed after DOMDiff is run.\n * So, normalization routines should update diff markers appropriately.\n *\n * SSS FIXME: Once we simplify WTS to get rid of rt-test mode,\n * we should be able to get rid of the 'children-changed' diff marker\n * and just use the more generic 'subtree-changed' marker.\n *\n * @module\n */\n\n'use strict';\n\nrequire('../../core-upgrade.js');\n\nconst { 
WikitextConstants: Consts } = require('../config/WikitextConstants.js');\nconst { ContentUtils } = require('../utils/ContentUtils.js');\nconst { DiffUtils } = require('./DiffUtils.js');\nconst { DOMDataUtils } = require('../utils/DOMDataUtils.js');\nconst { DOMUtils } = require('../utils/DOMUtils.js');\nconst { JSUtils } = require('../utils/jsutils.js');\nconst { WTSUtils } = require('./WTSUtils.js');\nconst { WTUtils } = require('../utils/WTUtils.js');\n\nconst wtIgnorableAttrs = new Set(['data-parsoid', 'id', 'title', DOMDataUtils.DataObjectAttrName()]);\nconst htmlIgnorableAttrs = new Set(['data-parsoid', DOMDataUtils.DataObjectAttrName()]);\nconst specializedAttribHandlers = JSUtils.mapObject({\n\t'data-mw': function(nodeA, dmwA, nodeB, dmwB, options) {\n\t\treturn JSUtils.deepEquals(dmwA, dmwB);\n\t},\n});\n\nfunction similar(a, b) {\n\tif (a.nodeName === 'A') {\n\t\t// FIXME: Similar to 1ce6a98, DOMUtils.nextNonDeletedSibling is being\n\t\t// used in this file where maybe DOMUtils.nextNonSepSibling belongs.\n\t\treturn DOMUtils.isElt(b) && DiffUtils.attribsEquals(a, b, wtIgnorableAttrs, specializedAttribHandlers);\n\t} else {\n\t\tvar aIsHtml = WTUtils.isLiteralHTMLNode(a);\n\t\tvar bIsHtml = WTUtils.isLiteralHTMLNode(b);\n\t\tvar ignorableAttrs = aIsHtml ? htmlIgnorableAttrs : wtIgnorableAttrs;\n\n\t\t// FIXME: For non-HTML I/B tags, we seem to be dropping all attributes\n\t\t// in our tag handlers (which seems like a bug). 
Till that is fixed,\n\t\t// we'll preserve existing functionality here.\n\t\treturn (!aIsHtml && !bIsHtml) ||\n\t\t\t(aIsHtml && bIsHtml && DiffUtils.attribsEquals(a, b, ignorableAttrs, specializedAttribHandlers));\n\t}\n}\n\n/**\n * Can a and b be merged into a single node?\n *\n * @param a\n * @param b\n * @param a\n * @param b\n */\nfunction mergable(a, b) {\n\treturn a.nodeName === b.nodeName && similar(a, b);\n}\n\n/**\n * Can a and b be combined into a single node\n * if we swap a and a.firstChild?\n *\n * For example: A='<b><i>x</i></b>' b='<i>y</i>' => '<i><b>x</b>y</i>'.\n *\n * @param a\n * @param b\n * @param a\n * @param b\n */\nfunction swappable(a, b) {\n\treturn DOMUtils.numNonDeletedChildNodes(a) === 1 &&\n\t\tsimilar(a, DOMUtils.firstNonDeletedChild(a)) &&\n\t\tmergable(DOMUtils.firstNonDeletedChild(a), b);\n}\n\nfunction firstChild(node, rtl) {\n\treturn rtl ? DOMUtils.lastNonDeletedChild(node) : DOMUtils.firstNonDeletedChild(node);\n}\n\nfunction isInsertedContent(node, env) {\n\twhile (true) {\n\t\tif (DiffUtils.hasInsertedDiffMark(node, env)) {\n\t\t\treturn true;\n\t\t}\n\t\tif (DOMUtils.isBody(node)) {\n\t\t\treturn false;\n\t\t}\n\t\tnode = node.parentNode;\n\t}\n}\n\n/*\n * Tag minimization\n * ----------------\n * Minimize a pair of tags in the dom tree rooted at node.\n *\n * This function merges adjacent nodes of the same type\n * and swaps nodes where possible to enable further merging.\n *\n * See examples below:\n *\n * 1. <b>X</b><b>Y</b>\n *    ==> <b>XY</b>\n *\n * 2. <i>A</i><b><i>X</i></b><b><i>Y</i></b><i>Z</i>\n *    ==> <i>A<b>XY</b>Z</i>\n *\n * 3. <a href=\"Football\">Foot</a><a href=\"Football\">ball</a>\n *    ==> <a href=\"Football\">Football</a>\n */\n\nfunction rewriteablePair(env, a, b) {\n\tif (Consts.WTQuoteTags.has(a.nodeName)) {\n\t\t// For <i>/<b> pair, we need not check whether the node being transformed\n\t\t// are new / edited, etc. 
since these minimization scenarios can\n\t\t// never show up in HTML that came from parsed wikitext.\n\t\t//\n\t\t// <i>..</i><i>..</i> can never show up without a <nowiki/> in between.\n\t\t// Similarly for <b>..</b><b>..</b> and <b><i>..</i></b><i>..</i>.\n\t\t//\n\t\t// This is because a sequence of 4 quotes is not parsed as ..</i><i>..\n\t\t// Neither is a sequence of 7 quotes parsed as ..</i></b><i>..\n\t\t//\n\t\t// So, if we see a minimizable pair of nodes, it is because the HTML\n\t\t// didn't originate from wikitext OR the HTML has been subsequently edited.\n\t\t// In both cases, we want to transform the DOM.\n\n\t\treturn Consts.WTQuoteTags.has(b.nodeName);\n\t} else if (env.scrubWikitext && a.nodeName === 'A') {\n\t\t// Link merging is only supported in scrubWikitext mode.\n\t\t// For <a> tags, we require at least one of the two tags\n\t\t// to be a newly created element.\n\t\treturn b.nodeName === 'A' && (WTUtils.isNewElt(a) || WTUtils.isNewElt(b));\n\t}\n}\n\n/**\n * @class\n * @param {SerializerState} state\n */\nclass DOMNormalizer {\n\tconstructor(state) {\n\t\tthis.env = state.env;\n\t\tthis.inSelserMode = state.selserMode;\n\t\tthis.inRtTestMode = state.rtTestMode;\n\t\tthis.inInsertedContent = false;\n\t}\n\n\taddDiffMarks(node, mark, dontRecurse) {\n\t\tvar env = this.env;\n\t\tif (!this.inSelserMode || DiffUtils.hasDiffMark(node, env, mark)) {\n\t\t\treturn;\n\t\t}\n\n\t\t// Don't introduce nested inserted markers\n\t\tif (this.inInsertedContent && mark === 'inserted') {\n\t\t\treturn;\n\t\t}\n\n\t\t// Newly added elements don't need diff marks\n\t\tif (!WTUtils.isNewElt(node)) {\n\t\t\tDiffUtils.addDiffMark(node, env, mark);\n\t\t\tif (mark === 'inserted' || mark === 'deleted') {\n\t\t\t\tDiffUtils.addDiffMark(node.parentNode, env, 'children-changed');\n\t\t\t}\n\t\t}\n\n\t\tif (dontRecurse) {\n\t\t\treturn;\n\t\t}\n\n\t\t// Walk up the subtree and add 'subtree-changed' markers\n\t\tnode = node.parentNode;\n\t\twhile (DOMUtils.isElt(node) && 
!DOMUtils.isBody(node)) {\n\t\t\tif (DiffUtils.hasDiffMark(node, env, 'subtree-changed')) {\n\t\t\t\treturn;\n\t\t\t}\n\t\t\tif (!WTUtils.isNewElt(node)) {\n\t\t\t\tDiffUtils.setDiffMark(node, env, 'subtree-changed');\n\t\t\t}\n\t\t\tnode = node.parentNode;\n\t\t}\n\t}\n\n\t/**\n\t * Transfer all of b's children to a and delete b.\n\t *\n\t * @param a\n\t * @param b\n\t * @param a\n\t * @param b\n\t */\n\tmerge(a, b) {\n\t\tvar sentinel = b.firstChild;\n\n\t\t// Migrate any intermediate nodes (usually 0 / 1 diff markers)\n\t\t// present between a and b to a\n\t\tvar next = a.nextSibling;\n\t\tif (next !== b) {\n\t\t\ta.appendChild(next);\n\t\t}\n\n\t\t// The real work of merging\n\t\tDOMUtils.migrateChildren(b, a);\n\t\tb.parentNode.removeChild(b);\n\n\t\t// Normalize the node to merge any adjacent text nodes\n\t\ta.normalize();\n\n\t\t// Update diff markers\n\t\tif (sentinel) {\n\t\t\t// Nodes starting at 'sentinal' were inserted into 'a'\n\t\t\t// b, which was a's sibling was deleted\n\t\t\t// Only addDiffMarks to sentinel, if it is still part of the dom\n\t\t\t// (and hasn't been deleted by the call to a.normalize() )\n\t\t\tif (sentinel.parentNode) {\n\t\t\t\tthis.addDiffMarks(sentinel, 'moved', true);\n\t\t\t}\n\t\t\tthis.addDiffMarks(a, 'children-changed', true);\n\t\t}\n\t\tif (a.nextSibling) {\n\t\t\t// FIXME: Hmm .. there is an API hole here\n\t\t\t// about ability to add markers after last child\n\t\t\tthis.addDiffMarks(a.nextSibling, 'moved', true);\n\t\t}\n\t\tthis.addDiffMarks(a.parentNode, 'children-changed');\n\n\t\treturn a;\n\t}\n\n\t/**\n\t * b is a's sole non-deleted child.  
Switch them around.\n\t *\n\t * @param a\n\t * @param b\n\t */\n\tswap(a, b) {\n\t\tDOMUtils.migrateChildren(b, a);\n\t\ta.parentNode.insertBefore(b, a);\n\t\tb.appendChild(a);\n\n\t\t// Mark a's subtree, a, and b as all having moved\n\t\tif (a.firstChild !== null) {\n\t\t\tthis.addDiffMarks(a.firstChild, 'moved', true);\n\t\t}\n\t\tthis.addDiffMarks(a, 'moved', true);\n\t\tthis.addDiffMarks(b, 'moved', true);\n\t\tthis.addDiffMarks(a, 'children-changed', true);\n\t\tthis.addDiffMarks(b, 'children-changed', true);\n\t\tthis.addDiffMarks(b.parentNode, 'children-changed');\n\n\t\treturn b;\n\t}\n\n\thoistLinks(node, rtl) {\n\t\tvar sibling = firstChild(node, rtl);\n\t\tvar hasHoistableContent = false;\n\n\t\twhile (sibling) {\n\t\t\tvar next = rtl ? DOMUtils.previousNonDeletedSibling(sibling) : DOMUtils.nextNonDeletedSibling(sibling);\n\t\t\tif (!DOMUtils.isContentNode(sibling)) {\n\t\t\t\tsibling = next;\n\t\t\t\tcontinue;\n\t\t\t} else if (!WTUtils.isRenderingTransparentNode(sibling)\n\t\t\t\t|| WTUtils.isEncapsulationWrapper(sibling)) {\n\t\t\t\t// Don't venture into templated content\n\t\t\t\tbreak;\n\t\t\t} else {\n\t\t\t\thasHoistableContent = true;\n\t\t\t}\n\t\t\tsibling = next;\n\t\t}\n\n\t\tif (hasHoistableContent) {\n\t\t\t// soak up all the non-content nodes (exclude sibling)\n\t\t\tvar move = firstChild(node, rtl);\n\t\t\tvar firstNode = move;\n\t\t\twhile (move !== sibling) {\n\t\t\t\tnode.parentNode.insertBefore(move, rtl ? DOMUtils.nextNonDeletedSibling(node) : node);\n\t\t\t\tmove = firstChild(node, rtl);\n\t\t\t}\n\n\t\t\t// and drop any leading whitespace\n\t\t\tif (DOMUtils.isText(sibling)) {\n\t\t\t\tvar space = new RegExp(rtl ? 
'\\\\s*$' : '^\\\\s*');\n\t\t\t\tsibling.nodeValue = sibling.nodeValue.replace(space, '');\n\t\t\t}\n\n\t\t\t// Update diff markers\n\t\t\tthis.addDiffMarks(firstNode, 'moved', true);\n\t\t\tif (sibling) { this.addDiffMarks(sibling, 'moved', true); }\n\t\t\tthis.addDiffMarks(node, 'children-changed', true);\n\t\t\tthis.addDiffMarks(node.parentNode, 'children-changed');\n\t\t}\n\t}\n\n\tstripIfEmpty(node) {\n\t\tvar next = DOMUtils.nextNonDeletedSibling(node);\n\t\tvar dp = DOMDataUtils.getDataParsoid(node);\n\t\tvar strict = this.inRtTestMode;\n\t\tvar autoInserted = dp.autoInsertedStart || dp.autoInsertedEnd;\n\n\t\t// In rtTestMode, let's reduce noise by requiring the node to be fully\n\t\t// empty (ie. exclude whitespace text) and not having auto-inserted tags.\n\t\tvar strippable = !(this.inRtTestMode && autoInserted) &&\n\t\t\tDOMUtils.nodeEssentiallyEmpty(node, strict) &&\n\t\t\t// Ex: \"<a..>..</a><b></b>bar\"\n\t\t\t// From [[Foo]]<b/>bar usage found on some dewiki pages.\n\t\t\t// FIXME: Should this always than just in rt-test mode\n\t\t\t!(this.inRtTestMode && dp.stx === 'html');\n\n\t\tif (strippable) {\n\t\t\t// Update diff markers (before the deletion)\n\t\t\tthis.addDiffMarks(node, 'deleted', true);\n\t\t\tnode.parentNode.removeChild(node);\n\t\t\treturn next;\n\t\t} else {\n\t\t\treturn node;\n\t\t}\n\t}\n\n\tmoveTrailingSpacesOut(node) {\n\t\tvar next = DOMUtils.nextNonDeletedSibling(node);\n\t\tvar last = DOMUtils.lastNonDeletedChild(node);\n\t\tvar endsInSpace = DOMUtils.isText(last) && last.nodeValue.match(/\\s+$/);\n\t\t// Conditional on rtTestMode to reduce the noise in testing.\n\t\tif (!this.inRtTestMode && endsInSpace) {\n\t\t\tlast.nodeValue = last.nodeValue.substring(0, endsInSpace.index);\n\t\t\t// Try to be a little smarter and drop the spaces if possible.\n\t\t\tif (next && (!DOMUtils.isText(next) || !/^\\s+/.test(next.nodeValue))) {\n\t\t\t\tif (!DOMUtils.isText(next)) {\n\t\t\t\t\tvar txt = 
node.ownerDocument.createTextNode('');\n\t\t\t\t\tnode.parentNode.insertBefore(txt, next);\n\t\t\t\t\tnext = txt;\n\t\t\t\t}\n\t\t\t\tnext.nodeValue = endsInSpace[0] + next.nodeValue;\n\t\t\t\t// next (a text node) is new / had new content added to it\n\t\t\t\tthis.addDiffMarks(next, 'inserted', true);\n\t\t\t}\n\t\t\tthis.addDiffMarks(last, 'inserted', true);\n\t\t\tthis.addDiffMarks(node.parentNode, 'children-changed');\n\t\t}\n\t}\n\n\tstripBRs(node) {\n\t\tvar child = node.firstChild;\n\t\twhile (child) {\n\t\t\tvar next = child.nextSibling;\n\t\t\tif (child.nodeName === 'BR') {\n\t\t\t\t// replace <br/> with a single space\n\t\t\t\tnode.removeChild(child);\n\t\t\t\tnode.insertBefore(node.ownerDocument.createTextNode(' '), next);\n\t\t\t} else if (DOMUtils.isElt(child)) {\n\t\t\t\tthis.stripBRs(child);\n\t\t\t}\n\t\t\tchild = next;\n\t\t}\n\t}\n\n\tstripBidiCharsAroundCategories(node) {\n\t\tif (!DOMUtils.isText(node) ||\n\t\t\t(!WTUtils.isCategoryLink(node.previousSibling) && !WTUtils.isCategoryLink(node.nextSibling))) {\n\t\t\t// Not a text node and not adjacent to a category link\n\t\t\treturn node;\n\t\t}\n\n\t\tvar next = node.nextSibling;\n\t\tif (!next || WTUtils.isCategoryLink(next)) {\n\t\t\t// The following can leave behind an empty text node.\n\t\t\tvar oldLength = node.nodeValue.length;\n\t\t\tnode.nodeValue = node.nodeValue.replace(/([\\u200e\\u200f]+\\n)?[\\u200e\\u200f]+$/g, '');\n\t\t\tvar newLength = node.nodeValue.length;\n\n\t\t\tif (oldLength !== newLength) {\n\t\t\t\t// Log changes for editors benefit\n\t\t\t\tthis.env.log('warn/html2wt/bidi',\n\t\t\t\t\t'LRM/RLM unicode chars stripped around categories');\n\t\t\t}\n\n\t\t\tif (newLength === 0) {\n\t\t\t\t// Remove empty text nodes to keep DOM in normalized form\n\t\t\t\tvar ret = DOMUtils.nextNonDeletedSibling(node);\n\t\t\t\tnode.parentNode.removeChild(node);\n\t\t\t\tthis.addDiffMarks(node, 'deleted');\n\t\t\t\treturn ret;\n\t\t\t}\n\n\t\t\t// Treat modified node as having been newly 
inserted\n\t\t\tthis.addDiffMarks(node, 'inserted');\n\t\t}\n\t\treturn node;\n\t}\n\n\t// When an A tag is encountered, if there are format tags inside, move them outside\n\t// Also merge a single sibling A tag that is mergable\n\t// The link href and text must match for this normalization to take effect\n\tmoveFormatTagOutsideATag(node) {\n\t\tif (this.inRtTestMode || node.nodeName !== 'A') {\n\t\t\treturn node;\n\t\t}\n\n\t\tvar sibling = DOMUtils.nextNonDeletedSibling(node);\n\t\tif (sibling) {\n\t\t\tthis.normalizeSiblingPair(node, sibling);\n\t\t}\n\n\t\tvar firstChild = DOMUtils.firstNonDeletedChild(node);\n\t\tvar fcNextSibling = null;\n\t\tif (firstChild) {\n\t\t\tfcNextSibling = DOMUtils.nextNonDeletedSibling(firstChild);\n\t\t}\n\n\t\tvar blockingAttrs = [ 'color', 'style', 'class' ];\n\n\t\tif (!node.hasAttribute('href')) {\n\t\t\tthis.env.log(\"error/normalize\", \"href is missing from a tag\", node.outerHTML);\n\t\t\treturn node;\n\t\t}\n\t\tvar nodeHref = node.getAttribute('href');\n\n\t\t// If there are no tags to swap, we are done\n\t\tif (firstChild && DOMUtils.isElt(firstChild) &&\n\t\t\t// No reordering possible with multiple children\n\t\t\tfcNextSibling === null &&\n\t\t\t// Do not normalize WikiLinks with these attributes\n\t\t\t!blockingAttrs.some(function(attr) { return firstChild.hasAttribute(attr); }) &&\n\t\t\t// Compare textContent to the href, noting that this matching doesn't handle all\n\t\t\t// possible simple-wiki-link scenarios that isSimpleWikiLink in link handler tackles\n\t\t\tnode.textContent === nodeHref.replace(/\\.\\//, '')\n\t\t) {\n\t\t\tvar child;\n\t\t\twhile ((child = DOMUtils.firstNonDeletedChild(node)) && DOMUtils.isFormattingElt(child)) {\n\t\t\t\tthis.swap(node, child);\n\t\t\t}\n\t\t\treturn firstChild;\n\t\t}\n\n\t\treturn node;\n\t}\n\n\t/**\n\t * scrubWikitext normalizations implemented right now:\n\t *\n\t * 1. Tag minimization (I/B tags) in normalizeSiblingPair\n\t * 2. 
Strip empty headings and style tags\n\t * 3. Force SOL transparent links to serialize before/after heading\n\t * 4. Trailing spaces are migrated out of links\n\t * 5. Space is added before escapable prefixes in table cells\n\t * 6. Strip <br/> from headings\n\t * 7. Strip bidi chars around categories\n\t * 8. When an A tag is encountered, if there are format tags inside, move them outside\n\t *\n\t * The return value from this function should respect the\n\t * following contract:\n\t * - if input node is unmodified, return it.\n\t * - if input node is modified, return the new node\n\t *   that it transforms into.\n\t * If you return a node other than this, normalizations may not\n\t * apply cleanly and may be skipped.\n\t *\n\t * @param {Node} node the node to normalize\n\t * @return {Node} the normalized node\n\t */\n\tnormalizeNode(node) {\n\t\tvar dp;\n\t\tif (node.nodeName === 'TH' || node.nodeName === 'TD') {\n\t\t\tdp = DOMDataUtils.getDataParsoid(node);\n\t\t\t// Table cells (td/th) previously used the stx_v flag for single-row syntax.\n\t\t\t// Newer code uses stx flag since that is used everywhere else.\n\t\t\t// While we still have old HTML in cache / storage, accept\n\t\t\t// the stx_v flag as well.\n\t\t\t// TODO: We are at html version 1.5.0 now. 
Once storage\n\t\t\t// no longer has version 1.5.0 content, we can get rid of\n\t\t\t// this b/c code.\n\t\t\tif (dp.stx_v) {\n\t\t\t\t// HTML (stx='html') elements will not have the stx_v flag set\n\t\t\t\t// since the single-row syntax only applies to native-wikitext.\n\t\t\t\t// So, we can safely override it here.\n\t\t\t\tdp.stx = dp.stx_v;\n\t\t\t}\n\t\t}\n\n\t\t// The following are done only if scrubWikitext flag is enabled\n\t\tif (!this.env.scrubWikitext) {\n\t\t\treturn node;\n\t\t}\n\n\t\tvar next;\n\n\t\tif (this.env.conf.parsoid.scrubBidiChars) {\n\t\t\t// Strip bidirectional chars around categories\n\t\t\t// Note that this is being done everywhere,\n\t\t\t// not just in selser mode\n\t\t\tnext = this.stripBidiCharsAroundCategories(node);\n\t\t\tif (next !== node) {\n\t\t\t\treturn next;\n\t\t\t}\n\t\t}\n\n\t\t// Skip unmodified content\n\t\tif (this.inSelserMode && !DOMUtils.isBody(node) &&\n\t\t\t!this.inInsertedContent && !DiffUtils.hasDiffMarkers(node, this.env) &&\n\t\t\t// If orig-src is not valid, this in effect becomes\n\t\t\t// an edited node and needs normalizations applied to it.\n\t\t\tWTSUtils.origSrcValidInEditedContext(this.env, node)) {\n\t\t\treturn node;\n\t\t}\n\n\t\t// Headings\n\t\tif (/^H[1-6]$/.test(node.nodeName)) {\n\t\t\tthis.hoistLinks(node, false);\n\t\t\tthis.hoistLinks(node, true);\n\t\t\tthis.stripBRs(node);\n\t\t\treturn this.stripIfEmpty(node);\n\n\t\t// Quote tags\n\t\t} else if (Consts.WTQuoteTags.has(node.nodeName)) {\n\t\t\treturn this.stripIfEmpty(node);\n\n\t\t// Anchors\n\t\t} else if (node.nodeName === 'A') {\n\t\t\tnext = DOMUtils.nextNonDeletedSibling(node);\n\t\t\t// We could have checked for !mw:ExtLink but in\n\t\t\t// the case of links without any annotations,\n\t\t\t// the positive test is semantically safer than the\n\t\t\t// negative test.\n\t\t\tif (/^mw:WikiLink$/.test(node.getAttribute('rel') || '') && this.stripIfEmpty(node) !== node) {\n\t\t\t\treturn 
next;\n\t\t\t}\n\t\t\tthis.moveTrailingSpacesOut(node);\n\t\t\treturn this.moveFormatTagOutsideATag(node);\n\n\t\t// Table cells\n\t\t} else if (node.nodeName === 'TD') {\n\t\t\tdp = DOMDataUtils.getDataParsoid(node);\n\t\t\t// * HTML <td>s won't have escapable prefixes\n\t\t\t// * First cell should always be checked for escapable prefixes\n\t\t\t// * Second and later cells in a wikitext td row (with stx='row' flag)\n\t\t\t//   won't have escapable prefixes.\n\t\t\tif (dp.stx === 'html' ||\n\t\t\t\t(DOMUtils.firstNonSepChild(node.parentNode) !== node && dp.stx === 'row')) {\n\t\t\t\treturn node;\n\t\t\t}\n\n\t\t\tvar first = DOMUtils.firstNonDeletedChild(node);\n\t\t\t// Emit a space before escapable prefix\n\t\t\t// This is preferable to serializing with a nowiki.\n\t\t\tif (DOMUtils.isText(first) && /^[\\-+}]/.test(first.nodeValue)) {\n\t\t\t\tfirst.nodeValue = ' ' + first.nodeValue;\n\t\t\t\tthis.addDiffMarks(first, 'inserted', true);\n\t\t\t}\n\t\t\treturn node;\n\n\t\t// Font tags without any attributes\n\t\t} else if (node.nodeName === 'FONT' && DOMDataUtils.noAttrs(node)) {\n\t\t\tnext = DOMUtils.nextNonDeletedSibling(node);\n\t\t\tDOMUtils.migrateChildren(node, node.parentNode, node);\n\t\t\tnode.parentNode.removeChild(node);\n\t\t\treturn next;\n\n\t\t// T184755: Convert sequences of <p></p> nodes to sequences of\n\t\t// <br/>, <p><br/>..other content..</p>, <p><br/><p/> to ensure\n\t\t// they serialize to as many newlines as the count of <p></p> nodes.\n\t\t} else if (node.nodeName === 'P' && !WTUtils.isLiteralHTMLNode(node) &&\n\t\t\t// Don't normalize empty p-nodes that came from source\n\t\t\t// FIXME: See T210647\n\t\t\t!/\\bmw-empty-elt\\b/.test(node.getAttribute('class') || '') &&\n\t\t\t// Don't apply normalization to <p></p> nodes that\n\t\t\t// were generated through deletions or other normalizations.\n\t\t\t// FIXME: This trick fails for non-selser mode since\n\t\t\t// diff markers are only added in selser 
mode.\n\t\t\tDOMUtils.hasNChildren(node, 0, true) &&\n\t\t\t// FIXME: Also, skip if this is the only child.\n\t\t\t// Eliminates spurious test failures in non-selser mode.\n\t\t\t!DOMUtils.hasNChildren(node.parentNode, 1)\n\t\t) {\n\t\t\tlet brParent, brSibling;\n\t\t\tconst br = node.ownerDocument.createElement('br');\n\t\t\tnext = DOMUtils.nextNonSepSibling(node);\n\t\t\tif (next && next.nodeName === 'P' && !WTUtils.isLiteralHTMLNode(next)) {\n\t\t\t\t// Replace 'node' (<p></p>) with a <br/> and make it the\n\t\t\t\t// first child of 'next' (<p>..</p>). If 'next' was actually\n\t\t\t\t// a <p></p> (i.e. empty), 'next' becomes <p><br/></p>\n\t\t\t\t// which will serialize to 2 newlines.\n\t\t\t\tbrParent = next;\n\t\t\t\tbrSibling = next.firstChild;\n\t\t\t} else {\n\t\t\t\t// We cannot merge the <br/> with 'next' because it\n\t\t\t\t// is not a <p>..</p>.\n\t\t\t\tbrParent = node.parentNode;\n\t\t\t\tbrSibling = node;\n\t\t\t}\n\n\t\t\t// Insert <br/>\n\t\t\tbrParent.insertBefore(br, brSibling);\n\t\t\t// Avoid nested insertion markers\n\t\t\tif (brParent === next && !isInsertedContent(brParent, this.env)) {\n\t\t\t\tthis.addDiffMarks(br, 'inserted');\n\t\t\t}\n\n\t\t\t// Delete node\n\t\t\tthis.addDiffMarks(node.parentNode, 'deleted');\n\t\t\tnode.parentNode.removeChild(node);\n\n\t\t\treturn next;\n\n\t\t// Default\n\t\t} else {\n\t\t\treturn node;\n\t\t}\n\t}\n\n\tnormalizeSiblingPair(a, b) {\n\t\tif (!rewriteablePair(this.env, a, b)) {\n\t\t\treturn b;\n\t\t}\n\n\t\t// Since 'a' and 'b' make a rewriteable tag-pair, we are good to go.\n\t\tif (mergable(a, b)) {\n\t\t\ta = this.merge(a, b);\n\t\t\t// The new a's children have new siblings. So let's look\n\t\t\t// at a again. 
But their grandkids haven't changed,\n\t\t\t// so we don't need to recurse further.\n\t\t\tthis.processSubtree(a, false);\n\t\t\treturn a;\n\t\t}\n\n\t\tif (swappable(a, b)) {\n\t\t\ta = this.merge(this.swap(a, DOMUtils.firstNonDeletedChild(a)), b);\n\t\t\t// Again, a has new children, but the grandkids have already\n\t\t\t// been minimized.\n\t\t\tthis.processSubtree(a, false);\n\t\t\treturn a;\n\t\t}\n\n\t\tif (swappable(b, a)) {\n\t\t\ta = this.merge(a, this.swap(b, DOMUtils.firstNonDeletedChild(b)));\n\t\t\t// Again, a has new children, but the grandkids have already\n\t\t\t// been minimized.\n\t\t\tthis.processSubtree(a, false);\n\t\t\treturn a;\n\t\t}\n\n\t\treturn b;\n\t}\n\n\tprocessSubtree(node, recurse) {\n\t\t// Process the first child outside the loop.\n\t\tvar a = DOMUtils.firstNonDeletedChild(node);\n\t\tif (!a) {\n\t\t\treturn;\n\t\t}\n\n\t\ta = this.processNode(a, recurse);\n\t\twhile (a) {\n\t\t\t// We need a pair of adjacent siblings for tag minimization.\n\t\t\tvar b = DOMUtils.nextNonDeletedSibling(a);\n\t\t\tif (!b) {\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// Process subtree rooted at 'b'.\n\t\t\tb = this.processNode(b, recurse);\n\n\t\t\t// If we skipped over a bunch of nodes in the middle,\n\t\t\t// we no longer have a pair of adjacent siblings.\n\t\t\tif (b && DOMUtils.previousNonDeletedSibling(b) === a) {\n\t\t\t\t// Process the pair.\n\t\t\t\ta = this.normalizeSiblingPair(a, b);\n\t\t\t} else {\n\t\t\t\ta = b;\n\t\t\t}\n\t\t}\n\t}\n\n\tprocessNode(node, recurse) {\n\t\t// Normalize 'node' and the subtree rooted at 'node'\n\t\t// recurse = true  => recurse and normalize subtree\n\t\t// recurse = false => assume the subtree is already normalized\n\n\t\t// Normalize node till it stabilizes\n\t\tvar next;\n\t\twhile (true) {\n\t\t\t// Skip templated content\n\t\t\twhile (node && WTUtils.isFirstEncapsulationWrapperNode(node)) {\n\t\t\t\tnode = WTUtils.skipOverEncapsulatedContent(node);\n\t\t\t}\n\n\t\t\tif (!node) {\n\t\t\t\treturn 
null;\n\t\t\t}\n\n\t\t\t// Set insertion marker\n\t\t\tvar insertedSubtree = DiffUtils.hasInsertedDiffMark(node, this.env);\n\t\t\tif (insertedSubtree) {\n\t\t\t\tif (this.inInsertedContent) {\n\t\t\t\t\t// Dump debugging info\n\t\t\t\t\tconsole.warn(\"--- Nested inserted dom-diff flags ---\");\n\t\t\t\t\tconsole.warn(\"Node:\", DOMUtils.isElt(node) ? ContentUtils.ppToXML(node) : node.textContent);\n\t\t\t\t\tconsole.warn(\"Node's parent:\", ContentUtils.ppToXML(node.parentNode));\n\t\t\t\t\tContentUtils.dumpDOM(node.ownerDocument.body,\n\t\t\t\t\t\t'-- DOM triggering nested inserted dom-diff flags --',\n\t\t\t\t\t\t{ storeDiffMark: true, env: this.env });\n\t\t\t\t}\n\t\t\t\t// FIXME: If this assert is removed, the above dumping code should\n\t\t\t\t// either be removed OR fixed up to remove uses of ContentUtils.ppToXML\n\t\t\t\tconsole.assert(!this.inInsertedContent, 'Found nested inserted dom-diff flags!');\n\t\t\t\tthis.inInsertedContent = true;\n\t\t\t}\n\n\t\t\t// Post-order traversal: Process subtree first, and current node after.\n\t\t\t// This lets multiple normalizations take effect cleanly.\n\t\t\tif (recurse && DOMUtils.isElt(node)) {\n\t\t\t\tthis.processSubtree(node, true);\n\t\t\t}\n\n\t\t\tnext = this.normalizeNode(node);\n\n\t\t\t// Clear insertion marker\n\t\t\tif (insertedSubtree) {\n\t\t\t\tthis.inInsertedContent = false;\n\t\t\t}\n\n\t\t\tif (next === node) {\n\t\t\t\treturn node;\n\t\t\t} else {\n\t\t\t\tnode = next;\n\t\t\t}\n\t\t}\n\n\t\tconsole.assert(false, \"Control should never get here!\");  // eslint-disable-line\n\t}\n\n\tnormalize(body) {\n\t\treturn this.processNode(body, true);\n\t}\n}\n\nif (typeof module === 'object') {\n\tmodule.exports.DOMNormalizer = 
DOMNormalizer;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/html2wt/DiffUtils.js","messages":[{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":14,"column":null,"nodeType":"Block","endLine":14,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'MWParserEnvironment' is undefined.","line":15,"column":null,"nodeType":"Block","endLine":15,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":25,"column":2,"nodeType":"Block","endLine":30,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":28,"column":null,"nodeType":"Block","endLine":28,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'MWParserEnvironment' is undefined.","line":29,"column":null,"nodeType":"Block","endLine":29,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":54,"column":2,"nodeType":"Block","endLine":59,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":58,"column":null,"nodeType":"Block","endLine":58,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":91,"column":null,"nodeType":"Block","endLine":91,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'MWParserEnvironment' is undefined.","line":92,"column":null,"nodeType":"Block","endLine":92,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is 
undefined.","line":117,"column":null,"nodeType":"Block","endLine":117,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Element' is undefined.","line":119,"column":null,"nodeType":"Block","endLine":119,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":128,"column":2,"nodeType":"Block","endLine":135,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":131,"column":null,"nodeType":"Block","endLine":131,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":132,"column":null,"nodeType":"Block","endLine":132,"endColumn":null}],"errorCount":0,"warningCount":14,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * @module\n */\n\n'use strict';\n\nconst { DOMDataUtils } = require('../utils/DOMDataUtils.js');\nconst { DOMUtils } = require('../utils/DOMUtils.js');\n\nclass DiffUtils {\n\t/**\n\t * Get a node's diff marker.\n\t *\n\t * @param {Node} node\n\t * @param {MWParserEnvironment} env\n\t * @return {Object|null}\n\t */\n\tstatic getDiffMark(node, env) {\n\t\tif (!DOMUtils.isElt(node)) { return null; }\n\t\tvar data = DOMDataUtils.getNodeData(node);\n\t\tvar dpd = data.parsoid_diff;\n\t\treturn dpd && dpd.id === env.page.id ? 
dpd : null;\n\t}\n\n\t/**\n\t * Check that the diff markers on the node exist and are recent.\n\t *\n\t * @param {Node} node\n\t * @param {MWParserEnvironment} env\n\t */\n\tstatic hasDiffMarkers(node, env) {\n\t\treturn this.getDiffMark(node, env) !== null || DOMUtils.isDiffMarker(node);\n\t}\n\n\tstatic hasDiffMark(node, env, mark) {\n\t\t// For 'deletion' and 'insertion' markers on non-element nodes,\n\t\t// a mw:DiffMarker meta is added\n\t\tif (mark === 'deleted' || (mark === 'inserted' && !DOMUtils.isElt(node))) {\n\t\t\treturn DOMUtils.isDiffMarker(node.previousSibling, mark);\n\t\t} else {\n\t\t\tvar diffMark = this.getDiffMark(node, env);\n\t\t\treturn diffMark && diffMark.diff.indexOf(mark) >= 0;\n\t\t}\n\t}\n\n\tstatic hasInsertedDiffMark(node, env) {\n\t\treturn this.hasDiffMark(node, env, 'inserted');\n\t}\n\n\tstatic maybeDeletedNode(node) {\n\t\treturn node && DOMUtils.isElt(node) && DOMUtils.isDiffMarker(node, 'deleted');\n\t}\n\n\t/**\n\t * Is node a mw:DiffMarker node that represents a deleted block node?\n\t * This annotation is added by the DOMDiff pass.\n\t *\n\t * @param node\n\t */\n\tstatic isDeletedBlockNode(node) {\n\t\treturn this.maybeDeletedNode(node) && node.hasAttribute('data-is-block');\n\t}\n\n\tstatic directChildrenChanged(node, env) {\n\t\treturn this.hasDiffMark(node, env, 'children-changed');\n\t}\n\n\tstatic onlySubtreeChanged(node, env) {\n\t\tvar dmark = this.getDiffMark(node, env);\n\t\treturn dmark && dmark.diff.every(function subTreechangeMarker(mark) {\n\t\t\treturn mark === 'subtree-changed' || mark === 'children-changed';\n\t\t});\n\t}\n\n\tstatic addDiffMark(node, env, mark) {\n\t\tif (mark === 'deleted' || mark === 'moved') {\n\t\t\tthis.prependTypedMeta(node, 'mw:DiffMarker/' + mark);\n\t\t} else if (DOMUtils.isText(node) || DOMUtils.isComment(node)) {\n\t\t\tif (mark !== 'inserted') {\n\t\t\t\tenv.log(\"error\", \"BUG! 
CHANGE-marker for \", node.nodeType, \" node is: \", mark);\n\t\t\t}\n\t\t\tthis.prependTypedMeta(node, 'mw:DiffMarker/' + mark);\n\t\t} else {\n\t\t\tthis.setDiffMark(node, env, mark);\n\t\t}\n\t}\n\n\t/**\n\t * Set a diff marker on a node.\n\t *\n\t * @param {Node} node\n\t * @param {MWParserEnvironment} env\n\t * @param {string} change\n\t */\n\tstatic setDiffMark(node, env, change) {\n\t\tif (!DOMUtils.isElt(node)) { return; }\n\t\tvar dpd = this.getDiffMark(node, env);\n\t\tif (dpd) {\n\t\t\t// Diff is up to date, append this change if it doesn't already exist\n\t\t\tif (dpd.diff.indexOf(change) === -1) {\n\t\t\t\tdpd.diff.push(change);\n\t\t\t}\n\t\t} else {\n\t\t\t// Was an old diff entry or no diff at all, reset\n\t\t\tdpd = {\n\t\t\t\t// The base page revision this change happened on\n\t\t\t\tid: env.page.id,\n\t\t\t\tdiff: [change],\n\t\t\t};\n\t\t}\n\t\tDOMDataUtils.getNodeData(node).parsoid_diff = dpd;\n\t}\n\n\t/**\n\t * Insert a meta element with the passed-in typeof attribute before a node.\n\t *\n\t * @param {Node} node\n\t * @param {string} type\n\t * @return {Element} The new meta.\n\t */\n\tstatic prependTypedMeta(node, type) {\n\t\tvar meta = node.ownerDocument.createElement('meta');\n\t\tmeta.setAttribute('typeof', type);\n\t\tnode.parentNode.insertBefore(meta, node);\n\t\treturn meta;\n\t}\n\n\t/**\n\t * Attribute equality test.\n\t *\n\t * @param {Node} nodeA\n\t * @param {Node} nodeB\n\t * @param {Set} [ignoreableAttribs] Set of attributes that should be ignored.\n\t * @param {Map} [specializedAttribHandlers] Map of attributes with specialized equals handlers.\n\t */\n\tstatic attribsEquals(nodeA, nodeB, ignoreableAttribs, specializedAttribHandlers) {\n\t\tif (!ignoreableAttribs) {\n\t\t\tignoreableAttribs = new Set();\n\t\t}\n\t\tif (!specializedAttribHandlers) {\n\t\t\tspecializedAttribHandlers = new Map();\n\t\t}\n\n\t\tfunction arrayToHash(node) {\n\t\t\tvar attrs = node.attributes || [];\n\t\t\tvar h = {};\n\t\t\tvar count = 
0;\n\t\t\tfor (var j = 0, n = attrs.length; j < n; j++) {\n\t\t\t\tvar a = attrs.item(j);\n\t\t\t\tif (!ignoreableAttribs.has(a.name)) {\n\t\t\t\t\tcount++;\n\t\t\t\t\th[a.name] = a.value;\n\t\t\t\t}\n\t\t\t}\n\t\t\t// If there's no special attribute handler, we want a straight\n\t\t\t// comparison of these.\n\t\t\tif (!ignoreableAttribs.has('data-parsoid')) {\n\t\t\t\th['data-parsoid'] = DOMDataUtils.getDataParsoid(node);\n\t\t\t\tcount++;\n\t\t\t}\n\t\t\tif (!ignoreableAttribs.has('data-mw') && DOMDataUtils.validDataMw(node)) {\n\t\t\t\th['data-mw'] = DOMDataUtils.getDataMw(node);\n\t\t\t\tcount++;\n\t\t\t}\n\t\t\treturn { h: h, count: count };\n\t\t}\n\n\t\tvar xA = arrayToHash(nodeA);\n\t\tvar xB = arrayToHash(nodeB);\n\n\t\tif (xA.count !== xB.count) {\n\t\t\treturn false;\n\t\t}\n\n\t\tvar hA = xA.h;\n\t\tvar keysA = Object.keys(hA).sort();\n\t\tvar hB = xB.h;\n\t\tvar keysB = Object.keys(hB).sort();\n\n\t\tfor (var i = 0; i < xA.count; i++) {\n\t\t\tvar k = keysA[i];\n\t\t\tif (k !== keysB[i]) {\n\t\t\t\treturn false;\n\t\t\t}\n\n\t\t\tvar attribEquals = specializedAttribHandlers.get(k);\n\t\t\tif (attribEquals) {\n\t\t\t\t// Use a specialized compare function, if provided\n\t\t\t\tif (!hA[k] || !hB[k] || !attribEquals(nodeA, hA[k], nodeB, hB[k])) {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t} else if (hA[k] !== hB[k]) {\n\t\t\t\treturn false;\n\t\t\t}\n\t\t}\n\n\t\treturn true;\n\t}\n}\n\nif (typeof module === \"object\") {\n\tmodule.exports.DiffUtils = DiffUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/html2wt/WTSUtils.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Found more than one @return declaration.","line":22,"column":2,"nodeType":"Block","endLine":34,"endColumn":5},{"ruleId":"jsdoc/require-returns-check","severity":1,"message":"Found more than one @return 
declaration.","line":22,"column":2,"nodeType":"Block","endLine":34,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":27,"column":null,"nodeType":"Block","endLine":27,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"name\" type.","line":28,"column":null,"nodeType":"Block","endLine":28,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"curVal\" type.","line":29,"column":null,"nodeType":"Block","endLine":29,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Found more than one @return declaration.","line":67,"column":2,"nodeType":"Block","endLine":76,"endColumn":5},{"ruleId":"jsdoc/require-returns-check","severity":1,"message":"Found more than one @return declaration.","line":67,"column":2,"nodeType":"Block","endLine":76,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":70,"column":null,"nodeType":"Block","endLine":70,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":85,"column":2,"nodeType":"Block","endLine":93,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"src\" type.","line":89,"column":null,"nodeType":"Block","endLine":89,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":90,"column":null,"nodeType":"Block","endLine":90,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"state\" type.","line":91,"column":null,"nodeType":"Block","endLine":91,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"dontEmit\" 
type.","line":92,"column":null,"nodeType":"Block","endLine":92,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":106,"column":2,"nodeType":"Block","endLine":114,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"src\" type.","line":110,"column":null,"nodeType":"Block","endLine":110,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":111,"column":null,"nodeType":"Block","endLine":111,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"state\" type.","line":112,"column":null,"nodeType":"Block","endLine":112,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"dontEmit\" type.","line":113,"column":null,"nodeType":"Block","endLine":113,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":127,"column":2,"nodeType":"Block","endLine":135,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"origNode\" type.","line":133,"column":null,"nodeType":"Block","endLine":133,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"before\" type.","line":134,"column":null,"nodeType":"Block","endLine":134,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":169,"column":2,"nodeType":"Block","endLine":174,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":172,"column":null,"nodeType":"Block","endLine":172,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"sepNode\" 
type.","line":173,"column":null,"nodeType":"Block","endLine":173,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'MWParserEnvironment' is undefined.","line":209,"column":null,"nodeType":"Block","endLine":209,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":210,"column":null,"nodeType":"Block","endLine":210,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":289,"column":null,"nodeType":"Block","endLine":289,"endColumn":null}],"errorCount":0,"warningCount":27,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/** @module */\n\n\"use strict\";\n\nconst { DOMDataUtils } = require('../utils/DOMDataUtils.js');\nconst { DOMUtils } = require('../utils/DOMUtils.js');\nconst { DiffUtils } = require('./DiffUtils.js');\nconst { WTUtils } = require('../utils/WTUtils.js');\n\n/** @namespace */\nclass WTSUtils {\n\tstatic isValidSep(sep) {\n\t\treturn sep.match(/^(\\s|<!--([^\\-]|-(?!->))*-->)*$/);\n\t}\n\n\tstatic hasValidTagWidths(dsr) {\n\t\treturn dsr &&\n\t\t\ttypeof (dsr[2]) === 'number' && dsr[2] >= 0 &&\n\t\t\ttypeof (dsr[3]) === 'number' && dsr[3] >= 0;\n\t}\n\n\t/**\n\t * For new elements, attrs are always considered modified.  
However, For\n\t * old elements, we only consider an attribute modified if we have shadow\n\t * info for it and it doesn't match the current value.\n\t *\n\t * @param node\n\t * @param name\n\t * @param curVal\n\t * @return {Object}\n\t * @return {any} return.value\n\t * @return {boolean} return.modified If the value of the attribute changed since we parsed the wikitext.\n\t * @return {boolean} return.fromsrc Whether we got the value from source-based roundtripping.\n\t */\n\tstatic getShadowInfo(node, name, curVal) {\n\t\tvar dp = DOMDataUtils.getDataParsoid(node);\n\n\t\t// Not the case, continue regular round-trip information.\n\t\tif (dp.a === undefined || dp.a[name] === undefined) {\n\t\t\treturn {\n\t\t\t\tvalue: curVal,\n\t\t\t\t// Mark as modified if a new element\n\t\t\t\tmodified: WTUtils.isNewElt(node),\n\t\t\t\tfromsrc: false,\n\t\t\t};\n\t\t} else if (dp.a[name] !== curVal) {\n\t\t\treturn {\n\t\t\t\tvalue: curVal,\n\t\t\t\tmodified: true,\n\t\t\t\tfromsrc: false,\n\t\t\t};\n\t\t} else if (dp.sa === undefined || dp.sa[name] === undefined) {\n\t\t\treturn {\n\t\t\t\tvalue: curVal,\n\t\t\t\tmodified: false,\n\t\t\t\tfromsrc: false,\n\t\t\t};\n\t\t} else {\n\t\t\treturn {\n\t\t\t\tvalue: dp.sa[name],\n\t\t\t\tmodified: false,\n\t\t\t\tfromsrc: true,\n\t\t\t};\n\t\t}\n\t}\n\n\t/**\n\t * Get shadowed information about an attribute on a node.\n\t *\n\t * @param {Node} node\n\t * @param {string} name\n\t * @return {Object}\n\t *   @return {any} return.value\n\t *   @return {boolean} return.modified If the value of the attribute changed since we parsed the wikitext.\n\t *   @return {boolean} return.fromsrc Whether we got the value from source-based roundtripping.\n\t */\n\tstatic getAttributeShadowInfo(node, name) {\n\t\treturn this.getShadowInfo(node, name, node.hasAttribute(name) ? 
node.getAttribute(name) : null);\n\t}\n\n\tstatic commentWT(comment) {\n\t\treturn '<!--' + WTUtils.decodeComment(comment) + '-->';\n\t}\n\n\t/**\n\t * Emit the start tag source when not round-trip testing, or when the node is\n\t * not marked with autoInsertedStart.\n\t *\n\t * @param src\n\t * @param node\n\t * @param state\n\t * @param dontEmit\n\t */\n\tstatic emitStartTag(src, node, state, dontEmit) {\n\t\tif (!state.rtTestMode || !DOMDataUtils.getDataParsoid(node).autoInsertedStart) {\n\t\t\tif (!dontEmit) {\n\t\t\t\tstate.emitChunk(src, node);\n\t\t\t}\n\t\t\treturn true;\n\t\t} else {\n\t\t\t// drop content\n\t\t\treturn false;\n\t\t}\n\t}\n\n\t/**\n\t * Emit the start tag source when not round-trip testing, or when the node is\n\t * not marked with autoInsertedStart.\n\t *\n\t * @param src\n\t * @param node\n\t * @param state\n\t * @param dontEmit\n\t */\n\tstatic emitEndTag(src, node, state, dontEmit) {\n\t\tif (!state.rtTestMode || !DOMDataUtils.getDataParsoid(node).autoInsertedEnd) {\n\t\t\tif (!dontEmit) {\n\t\t\t\tstate.emitChunk(src, node);\n\t\t\t}\n\t\t\treturn true;\n\t\t} else {\n\t\t\t// drop content\n\t\t\treturn false;\n\t\t}\n\t}\n\n\t/**\n\t * In wikitext, did origNode occur next to a block node which has been\n\t * deleted? While looking for next, we look past DOM nodes that are\n\t * transparent in rendering. (See emitsSolTransparentSingleLineWT for\n\t * which nodes.)\n\t *\n\t * @param origNode\n\t * @param before\n\t */\n\tstatic nextToDeletedBlockNodeInWT(origNode, before) {\n\t\tif (!origNode || DOMUtils.isBody(origNode)) {\n\t\t\treturn false;\n\t\t}\n\n\t\twhile (true) {\n\t\t\t// Find the nearest node that shows up in HTML (ignore nodes that show up\n\t\t\t// in wikitext but don't affect sol-state or HTML rendering -- note that\n\t\t\t// whitespace is being ignored, but that whitespace occurs between block nodes).\n\t\t\tvar node = origNode;\n\t\t\tdo {\n\t\t\t\tnode = before ? 
node.previousSibling : node.nextSibling;\n\t\t\t\tif (DiffUtils.maybeDeletedNode(node)) {\n\t\t\t\t\treturn DiffUtils.isDeletedBlockNode(node);\n\t\t\t\t}\n\t\t\t} while (node && WTUtils.emitsSolTransparentSingleLineWT(node));\n\n\t\t\tif (node) {\n\t\t\t\treturn false;\n\t\t\t} else {\n\t\t\t\t// Walk up past zero-width wikitext parents\n\t\t\t\tnode = origNode.parentNode;\n\t\t\t\tif (!WTUtils.isZeroWidthWikitextElt(node)) {\n\t\t\t\t\t// If the parent occupies space in wikitext,\n\t\t\t\t\t// clearly, we are not next to a deleted block node!\n\t\t\t\t\t// We'll eventually hit BODY here and return.\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t\torigNode = node;\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Check if whitespace preceding this node would NOT trigger an indent-pre.\n\t *\n\t * @param node\n\t * @param sepNode\n\t */\n\tstatic precedingSpaceSuppressesIndentPre(node, sepNode) {\n\t\tif (node !== sepNode && DOMUtils.isText(node)) {\n\t\t\t// if node is the same as sepNode, then the separator text\n\t\t\t// at the beginning of it has been stripped out already, and\n\t\t\t// we cannot use it to test it for indent-pre safety\n\t\t\treturn node.nodeValue.match(/^[ \\t]*\\n/);\n\t\t} else if (node.nodeName === 'BR') {\n\t\t\treturn true;\n\t\t} else if (WTUtils.isFirstEncapsulationWrapperNode(node)) {\n\t\t\t// Dont try any harder than this\n\t\t\treturn (!node.hasChildNodes()) || node.innerHTML.match(/^\\n/);\n\t\t} else {\n\t\t\treturn WTUtils.isBlockNodeWithVisibleWT(node);\n\t\t}\n\t}\n\n\tstatic traceNodeName(node) {\n\t\tswitch (node.nodeType) {\n\t\t\tcase node.ELEMENT_NODE:\n\t\t\t\treturn DOMUtils.isDiffMarker(node) ?\n\t\t\t\t\t\"DIFF_MARK\" : \"NODE: \" + node.nodeName;\n\t\t\tcase node.TEXT_NODE:\n\t\t\t\treturn \"TEXT: \" + JSON.stringify(node.nodeValue);\n\t\t\tcase node.COMMENT_NODE:\n\t\t\t\treturn \"CMT : \" + JSON.stringify(WTSUtils.commentWT(node.nodeValue));\n\t\t\tdefault:\n\t\t\t\treturn node.nodeName;\n\t\t}\n\t}\n\n\t/**\n\t * In selser mode, 
check if an unedited node's wikitext from source wikitext\n\t * is reusable as is.\n\t *\n\t * @param {MWParserEnvironment} env\n\t * @param {Node} node\n\t * @return {boolean}\n\t */\n\tstatic origSrcValidInEditedContext(env, node) {\n\t\tvar prev;\n\n\t\tif (WTUtils.isRedirectLink(node)) {\n\t\t\treturn DOMUtils.isBody(node.parentNode) && !node.previousSibling;\n\t\t} else if (node.nodeName === 'TH' || node.nodeName === 'TD') {\n\t\t\t// The wikitext representation for them is dependent\n\t\t\t// on cell position (first cell is always single char).\n\n\t\t\t// If there is no previous sibling, nothing to worry about.\n\t\t\tprev = node.previousSibling;\n\t\t\tif (!prev) {\n\t\t\t\treturn true;\n\t\t\t}\n\n\t\t\t// If previous sibling is unmodified, nothing to worry about.\n\t\t\tif (!DOMUtils.isDiffMarker(prev) &&\n\t\t\t\t!DiffUtils.hasInsertedDiffMark(prev, env) &&\n\t\t\t\t!DiffUtils.directChildrenChanged(prev, env)) {\n\t\t\t\treturn true;\n\t\t\t}\n\n\t\t\t// If it didn't have a stx marker that indicated that the cell\n\t\t\t// showed up on the same line via the \"||\" or \"!!\" syntax, nothing\n\t\t\t// to worry about.\n\t\t\treturn DOMDataUtils.getDataParsoid(node).stx !== 'row';\n\t\t} else if (node.nodeName === 'TR' && !DOMDataUtils.getDataParsoid(node).startTagSrc) {\n\t\t\t// If this <tr> didn't have a startTagSrc, it would have been\n\t\t\t// the first row of a table in original wikitext. So, it is safe\n\t\t\t// to reuse the original source for the row (without a \"|-\") as long as\n\t\t\t// it continues to be the first row of the table.  If not, since we need to\n\t\t\t// insert a \"|-\" to separate it from the newly added row (in an edit),\n\t\t\t// we cannot simply reuse orig. wikitext for this <tr>.\n\t\t\treturn !DOMUtils.previousNonSepSibling(node);\n\t\t} else if (DOMUtils.isNestedListOrListItem(node)) {\n\t\t\t// If there are no previous siblings, bullets were assigned to\n\t\t\t// containing elements in the ext.core.ListHandler. 
For example,\n\t\t\t//\n\t\t\t//   *** a\n\t\t\t//\n\t\t\t// Will assign bullets as,\n\t\t\t//\n\t\t\t//   <ul><li-*>\n\t\t\t//     <ul><li-*>\n\t\t\t//       <ul><li-*> a</li></ul>\n\t\t\t//     </li></ul>\n\t\t\t//   </li></ul>\n\t\t\t//\n\t\t\t// If we reuse the src for the inner li with the a, we'd be missing\n\t\t\t// two bullets because the tag handler for lists in the serializer only\n\t\t\t// emits start tag src when it hits a first child that isn't a list\n\t\t\t// element. We need to walk up and get them.\n\t\t\tprev = node.previousSibling;\n\t\t\tif (!prev) {\n\t\t\t\treturn false;\n\t\t\t}\n\n\t\t\t// If a previous sibling was modified, we can't reuse the start dsr.\n\t\t\twhile (prev) {\n\t\t\t\tif (DOMUtils.isDiffMarker(prev) ||\n\t\t\t\t\tDiffUtils.hasInsertedDiffMark(prev, env)\n\t\t\t\t) {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t\tprev = prev.previousSibling;\n\t\t\t}\n\n\t\t\treturn true;\n\t\t} else {\n\t\t\treturn true;\n\t\t}\n\t}\n\n\t/**\n\t * Extracts the media type from attribute string\n\t *\n\t * @param {Node} node\n\t * @return {Object}\n\t */\n\tstatic getMediaType(node) {\n\t\tconst typeOf = node.getAttribute('typeof') || '';\n\t\tconst match = typeOf.match(/(?:^|\\s)(mw:(?:Image|Video|Audio))(?:\\/(\\w*))?(?:\\s|$)/);\n\t\treturn {\n\t\t\trdfaType: match && match[1] || '',\n\t\t\tformat: match && match[2] || '',\n\t\t};\n\t}\n\n\t/**\n\t * @param {Object} dataMw\n\t * @param {string} key\n\t * @param {boolean} keep\n\t * @return {Array|null}\n\t */\n\tstatic getAttrFromDataMw(dataMw, key, keep) {\n\t\tconst arr = dataMw.attribs || [];\n\t\tconst i = arr.findIndex(a => (a[0] === key || a[0].txt === key));\n\t\tif (i < 0) { return null; }\n\t\tconst ret = arr[i];\n\t\tif (!keep && ret[1].html === undefined) {\n\t\t\tarr.splice(i, 1);\n\t\t}\n\t\treturn ret;\n\t}\n}\n\nif (typeof module === \"object\") {\n\tmodule.exports.WTSUtils = 
WTSUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/ContentUtils.js","messages":[{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":23,"column":null,"nodeType":"Block","endLine":23,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":35,"column":null,"nodeType":"Block","endLine":35,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'MWParserEnvironment' is undefined.","line":49,"column":null,"nodeType":"Block","endLine":49,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":52,"column":null,"nodeType":"Block","endLine":52,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":104,"column":null,"nodeType":"Block","endLine":104,"endColumn":null}],"errorCount":0,"warningCount":5,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * These utilities are for processing content that's generated\n * by parsing source input (ex: wikitext)\n *\n * @module\n */\n\n'use strict';\n\nrequire('../../core-upgrade.js');\n\nconst XMLSerializer = require('../wt2html/XMLSerializer.js');\n\nconst { DOMDataUtils } = require('./DOMDataUtils.js');\nconst { DOMUtils } = require('./DOMUtils.js');\nconst { Util } = require('./Util.js');\nconst { WTUtils } = require('./WTUtils.js');\n\nclass ContentUtils {\n\t/**\n\t * XML Serializer.\n\t *\n\t * @param {Node} node\n\t * @param {Object} [options] XMLSerializer options.\n\t * @return {string}\n\t */\n\tstatic toXML(node, options) {\n\t\treturn XMLSerializer.serialize(node, options).html;\n\t}\n\n\t/**\n\t * .dataobject aware XML serializer, to be used in the DOM\n\t * post-processing phase.\n\t *\n\t * 
@param {Node} node\n\t * @param {Object} [options]\n\t * @return {string}\n\t */\n\tstatic ppToXML(node, options) {\n\t\t// We really only want to pass along `options.keepTmp`\n\t\tDOMDataUtils.visitAndStoreDataAttribs(node, options);\n\t\treturn this.toXML(node, options);\n\t}\n\n\t/**\n\t * .dataobject aware HTML parser, to be used in the DOM\n\t * post-processing phase.\n\t *\n\t * @param {MWParserEnvironment} env\n\t * @param {string} html\n\t * @param {Object} [options]\n\t * @return {Node}\n\t */\n\tstatic ppToDOM(env, html, options) {\n\t\toptions = options || {};\n\t\tvar node = options.node;\n\t\tif (node === undefined) {\n\t\t\tnode = env.createDocument(html).body;\n\t\t} else {\n\t\t\tnode.innerHTML = html;\n\t\t}\n\t\tif (options.reinsertFosterableContent) {\n\t\t\tDOMUtils.visitDOM(node, (n, ...args) => {\n\t\t\t\t// untunnel fostered content\n\t\t\t\tconst meta = WTUtils.reinsertFosterableContent(env, n, true);\n\t\t\t\tn = (meta !== null) ? meta : n;\n\n\t\t\t\t// load data attribs\n\t\t\t\tDOMDataUtils.loadDataAttribs(n, ...args);\n\t\t\t}, options);\n\t\t} else {\n\t\t\t// load data attribs\n\t\t\tDOMDataUtils.visitAndLoadDataAttribs(node, options);\n\t\t}\n\t\treturn node;\n\t}\n\n\tstatic stripSectionTagsAndFallbackIds(node) {\n\t\tvar n = node.firstChild;\n\t\twhile (n) {\n\t\t\tvar next = n.nextSibling;\n\t\t\tif (DOMUtils.isElt(n)) {\n\t\t\t\t// Recurse into subtree before stripping this\n\t\t\t\tthis.stripSectionTagsAndFallbackIds(n);\n\n\t\t\t\t// Strip <section> tags\n\t\t\t\tif (WTUtils.isParsoidSectionTag(n)) {\n\t\t\t\t\tDOMUtils.migrateChildren(n, n.parentNode, n);\n\t\t\t\t\tn.parentNode.removeChild(n);\n\t\t\t\t}\n\n\t\t\t\t// Strip <span typeof='mw:FallbackId' ...></span>\n\t\t\t\tif (WTUtils.isFallbackIdSpan(n)) {\n\t\t\t\t\tn.parentNode.removeChild(n);\n\t\t\t\t}\n\t\t\t}\n\t\t\tn = next;\n\t\t}\n\t}\n\n\t/**\n\t * Dump the DOM with attributes.\n\t *\n\t * @param {Node} rootNode\n\t * @param {string} title\n\t * @param {Object} 
[options]\n\t */\n\tstatic dumpDOM(rootNode, title, options) {\n\t\toptions = options || {};\n\t\tif (options.storeDiffMark || options.dumpFragmentMap) { console.assert(options.env); }\n\n\t\tfunction cloneData(node, clone) {\n\t\t\tif (!DOMUtils.isElt(node)) { return; }\n\t\t\tvar d = DOMDataUtils.getNodeData(node);\n\t\t\tDOMDataUtils.setNodeData(clone, Util.clone(d));\n\t\t\tnode = node.firstChild;\n\t\t\tclone = clone.firstChild;\n\t\t\twhile (node) {\n\t\t\t\tcloneData(node, clone);\n\t\t\t\tnode = node.nextSibling;\n\t\t\t\tclone = clone.nextSibling;\n\t\t\t}\n\t\t}\n\n\t\tfunction emit(buf, opts) {\n\t\t\tif ('outBuffer' in opts) {\n\t\t\t\topts.outBuffer += buf.join('\\n');\n\t\t\t} else if (opts.outStream) {\n\t\t\t\topts.outStream.write(buf.join('\\n') + '\\n');\n\t\t\t} else {\n\t\t\t\tconsole.warn(buf.join('\\n'));\n\t\t\t}\n\t\t}\n\n\t\t// cloneNode doesn't clone data => walk DOM to clone it\n\t\tvar clonedRoot = rootNode.cloneNode(true);\n\t\tcloneData(rootNode, clonedRoot);\n\n\t\tvar buf = [];\n\t\tif (!options.quiet) {\n\t\t\tbuf.push('----- ' + title + ' -----');\n\t\t}\n\n\t\tbuf.push(ContentUtils.ppToXML(clonedRoot, options));\n\t\temit(buf, options);\n\n\t\t// Dump cached fragments\n\t\tif (options.dumpFragmentMap) {\n\t\t\tArray.from(options.env.fragmentMap.keys()).forEach(function(k) {\n\t\t\t\tbuf = [];\n\t\t\t\tbuf.push('='.repeat(15));\n\t\t\t\tbuf.push(\"FRAGMENT \" + k);\n\t\t\t\tbuf.push(\"\");\n\t\t\t\temit(buf, options);\n\n\t\t\t\tconst newOpts = Object.assign({}, options, { dumpFragmentMap: false, quiet: true });\n\t\t\t\tconst fragment = options.env.fragmentMap.get(k);\n\t\t\t\tContentUtils.dumpDOM(Array.isArray(fragment) ? 
fragment[0] : fragment, '', newOpts);\n\t\t\t});\n\t\t}\n\n\t\tif (!options.quiet) {\n\t\t\temit(['-'.repeat(title.length + 12)], options);\n\t\t}\n\t}\n}\n\nif (typeof module === \"object\") {\n\tmodule.exports.ContentUtils = ContentUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/DOMDataUtils.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":116,"column":2,"nodeType":"Block","endLine":122,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":119,"column":null,"nodeType":"Block","endLine":119,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":143,"column":null,"nodeType":"Block","endLine":143,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":183,"column":2,"nodeType":"Block","endLine":190,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":186,"column":null,"nodeType":"Block","endLine":186,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"type\" type.","line":187,"column":null,"nodeType":"Block","endLine":187,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"node\"","line":188,"column":null,"nodeType":"Block","endLine":188,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":188,"column":null,"nodeType":"Block","endLine":188,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"type\" 
type.","line":189,"column":null,"nodeType":"Block","endLine":189,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":207,"column":null,"nodeType":"Block","endLine":207,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"type\" type.","line":208,"column":null,"nodeType":"Block","endLine":208,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":232,"column":null,"nodeType":"Block","endLine":232,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"type\" type.","line":233,"column":null,"nodeType":"Block","endLine":233,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"node\"","line":234,"column":null,"nodeType":"Block","endLine":234,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":234,"column":null,"nodeType":"Block","endLine":234,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"type\" type.","line":235,"column":null,"nodeType":"Block","endLine":235,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":265,"column":null,"nodeType":"Block","endLine":265,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"env\" type.","line":266,"column":null,"nodeType":"Block","endLine":266,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"data\" type.","line":267,"column":null,"nodeType":"Block","endLine":267,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Document' is 
undefined.","line":294,"column":null,"nodeType":"Block","endLine":294,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Document' is undefined.","line":309,"column":null,"nodeType":"Block","endLine":309,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"doc\" type.","line":327,"column":null,"nodeType":"Block","endLine":327,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"pb\" type.","line":328,"column":null,"nodeType":"Block","endLine":328,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":384,"column":null,"nodeType":"Block","endLine":384,"endColumn":null}],"errorCount":0,"warningCount":24,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * These helpers pertain to HTML and data attributes of a node.\n *\n * @module\n */\n\n'use strict';\n\nconst semver = require('semver');\n\nconst { DOMUtils } = require('./DOMUtils.js');\nconst { JSUtils } = require('./jsutils.js');\n\nclass Bag {\n\tconstructor() {\n\t\tthis.dataobject = new Map();\n\t\tthis.docId = 0;\n\t\tthis.pagebundle = {\n\t\t\tparsoid: { counter: -1, ids: {} },\n\t\t\tmw: { ids: {} },\n\t\t};\n\t}\n\tgetPageBundle() { return this.pagebundle; }\n\tgetObject(docId) { return this.dataobject.get(docId); }\n\tstashObject(data) {\n\t\tconst docId = String(this.docId);\n\t\tthis.dataobject.set(docId, data);\n\t\tthis.docId += 1;\n\t\treturn docId;\n\t}\n}\n\nclass DOMDataUtils {\n\t// The following getters and setters load from the .dataobject store,\n\t// with the intention of eventually moving them off the nodes themselves.\n\n\t// WARNING: Don't use this directly, I guess except for in mocking.\n\t// Instead, you should be calling `env.createDocument()` if you need it.\n\tstatic setDocBag(doc, bag) {\n\t\tdoc.bag = bag || new Bag();\n\t}\n\n\tstatic DataObjectAttrName() {\n\t\treturn 
'data-object-id';\n\t}\n\n\tstatic noAttrs(node) {\n\t\treturn node.attributes.length === 0 ||\n\t\t\t(node.attributes.length === 1 && node.hasAttribute(this.DataObjectAttrName()));\n\t}\n\n\tstatic getNodeData(node) {\n\t\tlet dataobject;\n\t\tif (!node.hasAttribute(this.DataObjectAttrName())) {\n\t\t\tdataobject = {};\n\t\t\tthis.setNodeData(node, dataobject);\n\t\t\treturn dataobject;\n\t\t}\n\n\t\tconst docId = node.getAttribute(this.DataObjectAttrName()) || '';\n\t\tdataobject = node.ownerDocument.bag.getObject(docId);\n\t\tconsole.assert(!dataobject.stored);\n\t\treturn dataobject;\n\t}\n\n\tstatic setNodeData(node, data) {\n\t\tconst bag = node.ownerDocument.bag;\n\t\tconst docId = bag.stashObject(data);\n\t\tnode.setAttribute(this.DataObjectAttrName(), docId);\n\t}\n\n\tstatic getDataParsoid(node) {\n\t\tvar data = this.getNodeData(node);\n\t\tif (!data.parsoid) {\n\t\t\tdata.parsoid = {};\n\t\t}\n\t\tif (!data.parsoid.tmp) {\n\t\t\tdata.parsoid.tmp = {};\n\t\t}\n\t\treturn data.parsoid;\n\t}\n\n\tstatic setDataParsoid(node, dpObj) {\n\t\tvar data = this.getNodeData(node);\n\t\tdata.parsoid = dpObj;\n\t}\n\n\tstatic getDataParsoidDiff(node) {\n\t\tvar data = this.getNodeData(node);\n\t\t// We won't set a default value on this one\n\t\treturn data.parsoid_diff;\n\t}\n\n\tstatic setDataParsoidDiff(node, diffObj) {\n\t\tvar data = this.getNodeData(node);\n\t\tdata.parsoid_diff = diffObj;\n\t}\n\n\tstatic getDataMw(node) {\n\t\tvar data = this.getNodeData(node);\n\t\tif (!data.mw) {\n\t\t\tdata.mw = {};\n\t\t}\n\t\treturn data.mw;\n\t}\n\n\tstatic setDataMw(node, dmObj) {\n\t\tvar data = this.getNodeData(node);\n\t\tdata.mw = dmObj;\n\t}\n\n\tstatic validDataMw(node) {\n\t\treturn !!Object.keys(this.getDataMw(node)).length;\n\t}\n\n\t/**\n\t * Get an object from a JSON-encoded XML attribute on a node.\n\t *\n\t * @param {Node} node\n\t * @param {string} name Name of the attribute\n\t * @param {any} defaultVal What should be returned if we fail to find a valid 
JSON structure\n\t */\n\tstatic getJSONAttribute(node, name, defaultVal) {\n\t\tif (!DOMUtils.isElt(node)) {\n\t\t\treturn defaultVal;\n\t\t}\n\t\tif (!node.hasAttribute(name)) {\n\t\t\treturn defaultVal;\n\t\t}\n\t\tvar attVal = node.getAttribute(name);\n\t\ttry {\n\t\t\treturn JSON.parse(attVal);\n\t\t} catch (e) {\n\t\t\tconsole.warn('ERROR: Could not decode attribute-val ' + attVal +\n\t\t\t\t\t' for ' + name + ' on node ' + node.outerHTML);\n\t\t\treturn defaultVal;\n\t\t}\n\t}\n\n\t/**\n\t * Set an attribute on a node to a JSON-encoded object.\n\t *\n\t * @param {Node} node\n\t * @param {string} name Name of the attribute.\n\t * @param {Object} obj\n\t */\n\tstatic setJSONAttribute(node, name, obj) {\n\t\tnode.setAttribute(name, JSON.stringify(obj));\n\t}\n\n\t// Similar to the method on tokens\n\tstatic setShadowInfo(node, name, val, origVal, skipOrig) {\n\t\tconsole.assert(origVal !== undefined);\n\t\tif (!skipOrig && (val === origVal || origVal === null)) { return; }\n\t\tvar dp = this.getDataParsoid(node);\n\t\tif (!dp.a) { dp.a = {}; }\n\t\tif (!dp.sa) { dp.sa = {}; }\n\t\tif (!skipOrig &&\n\t\t\t\t// FIXME: This is a hack to not overwrite already shadowed info.\n\t\t\t\t// We should either fix the call site that depends on this\n\t\t\t\t// behaviour to do an explicit check, or double down on this\n\t\t\t\t// by porting it to the token method as well.\n\t\t\t\t!dp.a.hasOwnProperty(name)) {\n\t\t\tdp.sa[name] = origVal;\n\t\t}\n\t\tdp.a[name] = val;\n\t}\n\n\tstatic addAttributes(elt, attrs) {\n\t\tObject.keys(attrs).forEach(function(k) {\n\t\t\tif (attrs[k] !== null && attrs[k] !== undefined) {\n\t\t\t\telt.setAttribute(k, attrs[k]);\n\t\t\t}\n\t\t});\n\t}\n\n\t// Similar to the method on tokens\n\tstatic addNormalizedAttribute(node, name, val, origVal, skipOrig) {\n\t\tnode.setAttribute(name, val);\n\t\tthis.setShadowInfo(node, name, val, origVal, skipOrig);\n\t}\n\n\t/**\n\t * Test if a node matches a given typeof.\n\t *\n\t * @param node\n\t * @param 
type\n\t * @param node\n\t * @param type\n\t */\n\tstatic hasTypeOf(node, type) {\n\t\tif (!node.getAttribute) {\n\t\t\treturn false;\n\t\t}\n\t\tvar typeOfs = node.getAttribute('typeof') || '';\n\t\treturn typeOfs.split(/\\s+/g).indexOf(type) !== -1;\n\t}\n\n\t/**\n\t * Add a type to the typeof attribute. This method works for both tokens\n\t * and DOM nodes as it only relies on getAttribute and setAttribute, which\n\t * are defined for both.\n\t * Note that getAttribute returns null for an not-present attribute for\n\t * Tokens and JavaScript nodes, but it returns an empty string for a\n\t * not-present attribute for PHP nodes.\n\t *\n\t * @param node\n\t * @param type\n\t */\n\tstatic addTypeOf(node, type) {\n\t\tvar typeOf = node.getAttribute('typeof') || '';\n\t\tif (typeOf !== '') {\n\t\t\tvar types = typeOf.split(/\\s+/g);\n\t\t\tif (types.indexOf(type) === -1) {\n\t\t\t\t// not in type set yet, so add it.\n\t\t\t\ttypes.push(type);\n\t\t\t}\n\t\t\tnode.setAttribute('typeof', types.join(' '));\n\t\t} else {\n\t\t\tnode.setAttribute('typeof', type);\n\t\t}\n\t}\n\n\t/**\n\t * Remove a type from the typeof attribute. 
This method works on both\n\t * tokens and DOM nodes as it only relies on\n\t * getAttribute/setAttribute/removeAttribute.\n\t * Note that getAttribute returns null for an not-present attribute for\n\t * Tokens and JavaScript nodes, but it returns an empty string for a\n\t * not-present attribute for PHP nodes.\n\t *\n\t * @param node\n\t * @param type\n\t * @param node\n\t * @param type\n\t */\n\tstatic removeTypeOf(node, type) {\n\t\tvar typeOf = node.getAttribute('typeof') || '';\n\t\tfunction notType(t) {\n\t\t\treturn t !== type;\n\t\t}\n\t\tif (typeOf !== '') {\n\t\t\tvar types = typeOf.split(/\\s+/g).filter(notType);\n\n\t\t\tif (types.length) {\n\t\t\t\tnode.setAttribute('typeof', types.join(' '));\n\t\t\t} else {\n\t\t\t\tnode.removeAttribute('typeof');\n\t\t\t}\n\t\t}\n\t}\n\n\tstatic getPageBundle(doc) {\n\t\treturn doc.bag.getPageBundle();\n\t}\n\n\t/**\n\t * Removes the `data-*` attribute from a node, and migrates the data to the\n\t * document's JSON store. Generates a unique id with the following format:\n\t * ```\n\t * mw<base64-encoded counter>\n\t * ```\n\t * but attempts to keep user defined ids.\n\t *\n\t * @param node\n\t * @param env\n\t * @param data\n\t */\n\tstatic storeInPageBundle(node, env, data) {\n\t\tvar uid = node.getAttribute('id') || '';\n\t\tvar document = node.ownerDocument;\n\t\tvar pb = this.getPageBundle(document);\n\t\tvar docDp = pb.parsoid;\n\t\tvar origId = uid || null;\n\t\tif (docDp.ids.hasOwnProperty(uid)) {\n\t\t\tuid = null;\n\t\t\t// FIXME: Protect mw ids while tokenizing to avoid false positives.\n\t\t\tenv.log('info', 'Wikitext for this page has duplicate ids: ' + origId);\n\t\t}\n\t\tif (!uid) {\n\t\t\tdo {\n\t\t\t\tdocDp.counter += 1;\n\t\t\t\tuid = 'mw' + JSUtils.counterToBase64(docDp.counter);\n\t\t\t} while (document.getElementById(uid));\n\t\t\tthis.addNormalizedAttribute(node, 'id', uid, origId);\n\t\t}\n\t\tdocDp.ids[uid] = data.parsoid;\n\t\tif (data.hasOwnProperty('mw')) {\n\t\t\tpb.mw.ids[uid] = 
data.mw;\n\t\t}\n\t}\n\n\t/**\n\t * @param {Document} doc\n\t * @param {Object} obj\n\t */\n\tstatic injectPageBundle(doc, obj) {\n\t\tvar pb = JSON.stringify(obj);\n\t\tvar script = doc.createElement('script');\n\t\tthis.addAttributes(script, {\n\t\t\tid: 'mw-pagebundle',\n\t\t\ttype: 'application/x-mw-pagebundle',\n\t\t});\n\t\tscript.appendChild(doc.createTextNode(pb));\n\t\tdoc.head.appendChild(script);\n\t}\n\n\t/**\n\t * @param {Document} doc\n\t * @return {Object|null}\n\t */\n\tstatic extractPageBundle(doc) {\n\t\tvar pb = null;\n\t\tvar dpScriptElt = doc.getElementById('mw-pagebundle');\n\t\tif (dpScriptElt) {\n\t\t\tdpScriptElt.parentNode.removeChild(dpScriptElt);\n\t\t\tpb = JSON.parse(dpScriptElt.text);\n\t\t}\n\t\treturn pb;\n\t}\n\n\t/**\n\t * Applies the `data-*` attributes JSON structure to the document.\n\t * Leaves `id` attributes behind -- they are used by citation\n\t * code to extract `<ref>` body from the DOM.\n\t *\n\t * @param doc\n\t * @param pb\n\t */\n\tstatic applyPageBundle(doc, pb) {\n\t\tDOMUtils.visitDOM(doc.body, (node) => {\n\t\t\tif (DOMUtils.isElt(node)) {\n\t\t\t\tvar id = node.getAttribute('id') || '';\n\t\t\t\tif (pb.parsoid.ids.hasOwnProperty(id)) {\n\t\t\t\t\tthis.setJSONAttribute(node, 'data-parsoid', pb.parsoid.ids[id]);\n\t\t\t\t}\n\t\t\t\tif (pb.mw && pb.mw.ids.hasOwnProperty(id)) {\n\t\t\t\t\t// Only apply if it isn't already set.  
This means earlier\n\t\t\t\t\t// applications of the pagebundle have higher precedence,\n\t\t\t\t\t// inline data being the highest.\n\t\t\t\t\tif (!node.hasAttribute('data-mw')) {\n\t\t\t\t\t\tthis.setJSONAttribute(node, 'data-mw', pb.mw.ids[id]);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t});\n\t}\n\n\tstatic visitAndLoadDataAttribs(node, options) {\n\t\tDOMUtils.visitDOM(node, (...args) => this.loadDataAttribs(...args), options);\n\t}\n\n\t// These are intended be used on a document after post-processing, so that\n\t// the underlying .dataobject is transparently applied (in the store case)\n\t// and reloaded (in the load case), rather than worrying about keeping\n\t// the attributes up-to-date throughout that phase.  For the most part,\n\t// using this.ppTo* should be sufficient and using these directly should be\n\t// avoided.\n\n\tstatic loadDataAttribs(node, options) {\n\t\tif (!DOMUtils.isElt(node)) { return; }\n\t\toptions = options || {};\n\t\t// Reset the node data object's stored state, since we're reloading it\n\t\tthis.setNodeData(node, {});\n\t\tvar dp = this.getJSONAttribute(node, 'data-parsoid', {});\n\t\tif (options.markNew) {\n\t\t\tif (!dp.tmp) { dp.tmp = {}; }\n\t\t\tdp.tmp.isNew = !node.hasAttribute('data-parsoid');\n\t\t}\n\t\tthis.setDataParsoid(node, dp);\n\t\tnode.removeAttribute('data-parsoid');\n\t\tthis.setDataMw(node, this.getJSONAttribute(node, 'data-mw', undefined));\n\t\tnode.removeAttribute('data-mw');\n\t\tconst dpd = this.getJSONAttribute(node, 'data-parsoid-diff', undefined);\n\t\tthis.setDataParsoidDiff(node, dpd);\n\t\tnode.removeAttribute('data-parsoid-diff');\n\t}\n\n\tstatic visitAndStoreDataAttribs(node, options) {\n\t\tDOMUtils.visitDOM(node, (...args) => this.storeDataAttribs(...args), options);\n\t}\n\n\t/**\n\t * @param {Node} node\n\t * @param {Object} [options]\n\t */\n\tstatic storeDataAttribs(node, options) {\n\t\tif (!DOMUtils.isElt(node)) { return; }\n\t\toptions = options || 
{};\n\t\tconsole.assert(!(options.discardDataParsoid && options.keepTmp));  // Just a sanity check\n\t\tvar dp = this.getDataParsoid(node);\n\t\t// Don't modify `options`, they're reused.\n\t\tvar discardDataParsoid = options.discardDataParsoid;\n\t\tif (dp.tmp.isNew) {\n\t\t\t// Only necessary to support the cite extension's getById,\n\t\t\t// that's already been loaded once.\n\t\t\t//\n\t\t\t// This is basically a hack to ensure that DOMUtils.isNewElt\n\t\t\t// continues to work since we effectively rely on the absence\n\t\t\t// of data-parsoid to identify new elements. But, loadDataAttribs\n\t\t\t// creates an empty {} if one doesn't exist. So, this hack\n\t\t\t// ensures that a loadDataAttribs + storeDataAttribs pair don't\n\t\t\t// dirty the node by introducing an empty data-parsoid attribute\n\t\t\t// where one didn't exist before.\n\t\t\t//\n\t\t\t// Ideally, we'll find a better solution for this edge case later.\n\t\t\tdiscardDataParsoid = true;\n\t\t}\n\t\tvar data = null;\n\t\tif (!discardDataParsoid) {\n\t\t\tif (options.keepTmp) {\n\t\t\t\t// tmp.tplRanges is used during template wrapping and not at all\n\t\t\t\t// after that. 
This property has DOM nodes in it and will not\n\t\t\t\t// JSON.stringify.\n\t\t\t\tdp.tmp.tplRanges = undefined;\n\t\t\t} else {\n\t\t\t\tdp.tmp = undefined;\n\t\t\t}\n\n\t\t\tif (options.storeInPageBundle) {\n\t\t\t\tdata = data || {};\n\t\t\t\tdata.parsoid = dp;\n\t\t\t} else {\n\t\t\t\tthis.setJSONAttribute(node, 'data-parsoid', dp);\n\t\t\t}\n\t\t}\n\t\t// We need to serialize diffs only under special circumstances.\n\t\t// So, do it on demand.\n\t\tif (options.storeDiffMark) {\n\t\t\tconst dpDiff = this.getDataParsoidDiff(node);\n\t\t\tif (dpDiff) {\n\t\t\t\tthis.setJSONAttribute(node, 'data-parsoid-diff', dpDiff);\n\t\t\t}\n\t\t}\n\t\t// Strip invalid data-mw attributes\n\t\tif (this.validDataMw(node)) {\n\t\t\tif (options.storeInPageBundle && options.env &&\n\t\t\t\t\t// The pagebundle didn't have data-mw before 999.x\n\t\t\t\t\tsemver.satisfies(options.env.outputContentVersion, '^999.0.0')) {\n\t\t\t\tdata = data || {};\n\t\t\t\tdata.mw = this.getDataMw(node);\n\t\t\t} else {\n\t\t\t\tthis.setJSONAttribute(node, 'data-mw', this.getDataMw(node));\n\t\t\t}\n\t\t}\n\t\t// Store pagebundle\n\t\tif (data !== null) {\n\t\t\tthis.storeInPageBundle(node, options.env, data);\n\t\t}\n\t\t// Indicate that this node's data has been stored so that if we try\n\t\t// to access it after the fact we're aware and remove the attribute\n\t\t// since it's no longer needed.\n\t\tconst nd = this.getNodeData(node);\n\t\tnd.stored = true;\n\t\tnode.removeAttribute(this.DataObjectAttrName());\n\t}\n}\n\nif (typeof module === \"object\") {\n\tmodule.exports.DOMDataUtils = DOMDataUtils;\n\tmodule.exports.Bag = Bag;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/DOMTraverser.js","messages":[{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is 
undefined.","line":30,"column":null,"nodeType":"Block","endLine":30,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'MWParserEnvironment' is undefined.","line":31,"column":null,"nodeType":"Block","endLine":31,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'true' is undefined.","line":34,"column":null,"nodeType":"Block","endLine":34,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'false' is undefined.","line":34,"column":null,"nodeType":"Block","endLine":34,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":34,"column":null,"nodeType":"Block","endLine":34,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'traverserHandler' is undefined.","line":44,"column":null,"nodeType":"Block","endLine":44,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":51,"column":1,"nodeType":"Block","endLine":57,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":52,"column":null,"nodeType":"Block","endLine":52,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"env\" type.","line":53,"column":null,"nodeType":"Block","endLine":53,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"atTopLevel\" type.","line":54,"column":null,"nodeType":"Block","endLine":54,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"tplInfo\" type.","line":55,"column":null,"nodeType":"Block","endLine":55,"endColumn":null},{"ruleId":"jsdoc/require-returns-check","severity":1,"message":"JSDoc @return declaration present but return expression not available in 
function.","line":80,"column":1,"nodeType":"Block","endLine":97,"endColumn":4},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":91,"column":null,"nodeType":"Block","endLine":91,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'MWParserEnvironment' is undefined.","line":92,"column":null,"nodeType":"Block","endLine":92,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'true' is undefined.","line":96,"column":null,"nodeType":"Block","endLine":96,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":96,"column":null,"nodeType":"Block","endLine":96,"endColumn":null}],"errorCount":0,"warningCount":16,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * Pre-order depth-first DOM traversal helper.\n *\n * @module\n */\n\n'use strict';\n\nvar DOMDataUtils = require('./DOMDataUtils.js').DOMDataUtils;\nvar DOMUtils = require('./DOMUtils.js').DOMUtils;\nvar JSUtils = require('./jsutils.js').JSUtils;\nvar WTUtils = require('./WTUtils.js').WTUtils;\n\n/**\n * Class for helping us traverse the DOM.\n *\n * This class currently does a pre-order depth-first traversal.\n * See {@link DOMPostOrder} for post-order traversal.\n *\n * @class\n */\nfunction DOMTraverser() {\n\tthis.handlers = [];\n}\n\n/**\n * DOM traversal handler.\n *\n * @callback module:utils/DOMTraverser~traverserHandler\n * @param {Node} node\n * @param {MWParserEnvironment} env\n * @param {boolean} atTopLevel\n * @param {Object} tplInfo Template information.\n * @return {Node|null|false|true}\n *   Return false if you want to stop any further callbacks from being\n *   called on the node.  
Return the new node if you need to replace it or\n *   change its siblings; traversal will continue with the new node.\n */\n\n/**\n * Add a handler to the DOM traverser.\n *\n * @param {string} nodeName\n * @param {traverserHandler} action\n *   A callback, called on each node we traverse that matches nodeName.\n */\nDOMTraverser.prototype.addHandler = function(nodeName, action) {\n\tthis.handlers.push({ action, nodeName });\n};\n\n/**\n * @param node\n * @param env\n * @param atTopLevel\n * @param tplInfo\n * @private\n */\nDOMTraverser.prototype.callHandlers = function(node, env, atTopLevel, tplInfo) {\n\tvar name = (node.nodeName || '').toLowerCase();\n\n\tfor (const handler of this.handlers) {\n\t\tif (handler.nodeName === null || handler.nodeName === name) {\n\t\t\tvar result = handler.action(node, env, atTopLevel, tplInfo);\n\t\t\tif (result !== true) {\n\t\t\t\tif (result === undefined) {\n\t\t\t\t\tenv.log(\"error\",\n\t\t\t\t\t\t'DOMPostProcessor.traverse: undefined return!',\n\t\t\t\t\t\t'Bug in', handler.action.toString(),\n\t\t\t\t\t\t' when handling ', node.outerHTML);\n\t\t\t\t}\n\t\t\t\t// abort processing for this node\n\t\t\t\treturn result;\n\t\t\t}\n\t\t}\n\t}\n\n\treturn true;\n};\n\n/**\n * Traverse the DOM and fire the handlers that are registered.\n *\n * Handlers can return\n * - the next node to process\n *   - aborts processing for current node, continues with returned node\n *   - can also be `null`, so returning `workNode.nextSibling` works even when\n *     workNode is a last child of its parent\n * - `true`\n *   - continue regular processing on current node.\n *\n * @param {Node} workNode\n * @param {MWParserEnvironment} env\n * @param {Object} options\n * @param {boolean} atTopLevel\n * @param {Object} tplInfo Template information.\n * @return {Node|null|true}\n */\nDOMTraverser.prototype.traverse = function(workNode, env, options, atTopLevel, tplInfo) {\n\twhile (workNode !== null) {\n\t\tif (DOMUtils.isElt(workNode)) {\n\t\t\t// 
Identify the first template/extension node.\n\t\t\t// You'd think the !tplInfo check isn't necessary since\n\t\t\t// we don't have nested transclusions, however, you can\n\t\t\t// get extensions in transclusions.\n\t\t\tif (!tplInfo && WTUtils.isFirstEncapsulationWrapperNode(workNode)\n\t\t\t\t\t// Ensure this isn't just a meta marker, since we might\n\t\t\t\t\t// not be traversing after encapsulation.  Note that the\n\t\t\t\t\t// valid data-mw assertion is the same test as used in\n\t\t\t\t\t// cleanup.\n\t\t\t\t\t&& (!WTUtils.isTplMarkerMeta(workNode) || DOMDataUtils.validDataMw(workNode))\n\t\t\t\t\t// Encapsulation info on sections should not be used to\n\t\t\t\t\t// traverse with since it's designed to be dropped and\n\t\t\t\t\t// may have expanded ranges.\n\t\t\t\t\t&& !WTUtils.isParsoidSectionTag(workNode)) {\n\t\t\t\tvar about = workNode.getAttribute(\"about\") || '';\n\t\t\t\ttplInfo = {\n\t\t\t\t\tfirst: workNode,\n\t\t\t\t\tlast: JSUtils.lastItem(WTUtils.getAboutSiblings(workNode, about)),\n\t\t\t\t\tclear: false,\n\t\t\t\t};\n\t\t\t}\n\t\t}\n\n\t\t// Call the handlers on this workNode\n\t\tvar possibleNext = this.callHandlers(workNode, env, atTopLevel, tplInfo);\n\n\t\t// We may have walked passed the last about sibling or want to\n\t\t// ignore the template info in future processing.\n\t\tif (tplInfo && tplInfo.clear) {\n\t\t\ttplInfo = null;\n\t\t}\n\n\t\tif (possibleNext === true) {\n\t\t\t// the 'continue processing' case\n\t\t\tif (DOMUtils.isElt(workNode) && workNode.hasChildNodes()) {\n\t\t\t\tthis.traverse(workNode.firstChild, env, options, atTopLevel, tplInfo);\n\t\t\t}\n\t\t\tpossibleNext = workNode.nextSibling;\n\t\t}\n\n\t\t// Clear the template info after reaching the last about sibling.\n\t\tif (tplInfo && tplInfo.last === workNode) {\n\t\t\ttplInfo = null;\n\t\t}\n\n\t\tworkNode = possibleNext;\n\t}\n};\n\nif (typeof module === \"object\") {\n\tmodule.exports.DOMTraverser = 
DOMTraverser;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/DOMUtils.js","messages":[{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":23,"column":null,"nodeType":"Block","endLine":23,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":42,"column":null,"nodeType":"Block","endLine":42,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"handler\" type.","line":43,"column":null,"nodeType":"Block","endLine":43,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"from\" type.","line":60,"column":null,"nodeType":"Block","endLine":60,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"to\" type.","line":61,"column":null,"nodeType":"Block","endLine":61,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"beforeNode\" type.","line":62,"column":null,"nodeType":"Block","endLine":62,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"from\" type.","line":79,"column":null,"nodeType":"Block","endLine":79,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"to\" type.","line":80,"column":null,"nodeType":"Block","endLine":80,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"beforeNode\" type.","line":81,"column":null,"nodeType":"Block","endLine":81,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return 
declaration.","line":95,"column":2,"nodeType":"Block","endLine":100,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":99,"column":null,"nodeType":"Block","endLine":99,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":105,"column":2,"nodeType":"Block","endLine":110,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":109,"column":null,"nodeType":"Block","endLine":109,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":115,"column":2,"nodeType":"Block","endLine":120,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":119,"column":null,"nodeType":"Block","endLine":119,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":125,"column":2,"nodeType":"Block","endLine":130,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":129,"column":null,"nodeType":"Block","endLine":129,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":147,"column":2,"nodeType":"Block","endLine":158,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":155,"column":null,"nodeType":"Block","endLine":155,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"nchildren\" type.","line":156,"column":null,"nodeType":"Block","endLine":156,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"countDiffMarkers\" 
type.","line":157,"column":null,"nodeType":"Block","endLine":157,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":174,"column":null,"nodeType":"Block","endLine":174,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":175,"column":null,"nodeType":"Block","endLine":175,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":176,"column":null,"nodeType":"Block","endLine":176,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":191,"column":null,"nodeType":"Block","endLine":191,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":192,"column":null,"nodeType":"Block","endLine":192,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":201,"column":null,"nodeType":"Block","endLine":201,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":202,"column":null,"nodeType":"Block","endLine":202,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":204,"column":null,"nodeType":"Block","endLine":204,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":216,"column":2,"nodeType":"Block","endLine":222,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":220,"column":null,"nodeType":"Block","endLine":220,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":221,"column":null,"nodeType":"Block","endLine":221,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing 
JSDoc @return declaration.","line":230,"column":2,"nodeType":"Block","endLine":236,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":234,"column":null,"nodeType":"Block","endLine":234,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":235,"column":null,"nodeType":"Block","endLine":235,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":244,"column":2,"nodeType":"Block","endLine":249,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":247,"column":null,"nodeType":"Block","endLine":247,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":261,"column":null,"nodeType":"Block","endLine":261,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":275,"column":null,"nodeType":"Block","endLine":275,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'bool' is undefined.","line":278,"column":null,"nodeType":"Block","endLine":278,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":289,"column":null,"nodeType":"Block","endLine":289,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":306,"column":null,"nodeType":"Block","endLine":306,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'bool' is undefined.","line":309,"column":null,"nodeType":"Block","endLine":309,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is 
undefined.","line":351,"column":null,"nodeType":"Block","endLine":351,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'bool' is undefined.","line":353,"column":null,"nodeType":"Block","endLine":353,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":371,"column":2,"nodeType":"Block","endLine":375,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":374,"column":null,"nodeType":"Block","endLine":374,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":386,"column":2,"nodeType":"Block","endLine":390,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":389,"column":null,"nodeType":"Block","endLine":389,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":405,"column":2,"nodeType":"Block","endLine":409,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":408,"column":null,"nodeType":"Block","endLine":408,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":429,"column":2,"nodeType":"Block","endLine":434,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":433,"column":null,"nodeType":"Block","endLine":433,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":443,"column":2,"nodeType":"Block","endLine":448,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" 
type.","line":447,"column":null,"nodeType":"Block","endLine":447,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":485,"column":2,"nodeType":"Block","endLine":489,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":488,"column":null,"nodeType":"Block","endLine":488,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":498,"column":2,"nodeType":"Block","endLine":502,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":501,"column":null,"nodeType":"Block","endLine":501,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":511,"column":2,"nodeType":"Block","endLine":515,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":514,"column":null,"nodeType":"Block","endLine":514,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":524,"column":2,"nodeType":"Block","endLine":528,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":527,"column":null,"nodeType":"Block","endLine":527,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":537,"column":2,"nodeType":"Block","endLine":541,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":540,"column":null,"nodeType":"Block","endLine":540,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return 
declaration.","line":555,"column":2,"nodeType":"Block","endLine":559,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":558,"column":null,"nodeType":"Block","endLine":558,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":571,"column":2,"nodeType":"Block","endLine":577,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":575,"column":null,"nodeType":"Block","endLine":575,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"strict\" type.","line":576,"column":null,"nodeType":"Block","endLine":576,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":594,"column":2,"nodeType":"Block","endLine":600,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":598,"column":null,"nodeType":"Block","endLine":598,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"tagName\" type.","line":599,"column":null,"nodeType":"Block","endLine":599,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":618,"column":null,"nodeType":"Block","endLine":618,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":628,"column":null,"nodeType":"Block","endLine":628,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":629,"column":null,"nodeType":"Block","endLine":629,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Document' is 
undefined.","line":639,"column":null,"nodeType":"Block","endLine":639,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Document' is undefined.","line":652,"column":null,"nodeType":"Block","endLine":652,"endColumn":null}],"errorCount":0,"warningCount":78,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * DOM utilities for querying the DOM. This is largely independent of Parsoid\n * although some Parsoid details (diff markers, TokenUtils, inline content version)\n * have snuck in. Trying to prevent that is probably not worth the effort yet\n * at this stage of refactoring.\n *\n * @module\n */\n\n'use strict';\n\nconst domino = require('domino');\n\nconst Consts = require('../config/WikitextConstants.js').WikitextConstants;\nconst { JSUtils } = require('./jsutils.js');\nconst { TokenUtils } = require('./TokenUtils.js');\n\nclass DOMUtils {\n\t/**\n\t * Parse HTML, return the tree.\n\t *\n\t * @param {string} html\n\t * @return {Node}\n\t */\n\tstatic parseHTML(html) {\n\t\thtml = html || '';\n\t\tif (!html.match(/^<(?:!doctype|html|body)/i)) {\n\t\t\t// Make sure that we parse fragments in the body. 
Otherwise comments,\n\t\t\t// link and meta tags end up outside the html element or in the head\n\t\t\t// element.\n\t\t\thtml = '<body>' + html;\n\t\t}\n\t\treturn domino.createDocument(html);\n\t}\n\n\t/**\n\t * This is a simplified version of the DOMTraverser.\n\t * Consider using that before making this more complex.\n\t *\n\t * FIXME: Move to DOMTraverser OR create a new class?\n\t *\n\t * @param node\n\t * @param handler\n\t * @param {...any} args\n\t */\n\tstatic visitDOM(node, handler, ...args) {\n\t\thandler(node, ...args);\n\t\tnode = node.firstChild;\n\t\twhile (node) {\n\t\t\tconst next = node.nextSibling;\n\t\t\tthis.visitDOM(node, handler, ...args);\n\t\t\tnode = next;\n\t\t}\n\t}\n\n\t/**\n\t * Move 'from'.childNodes to 'to' adding them before 'beforeNode'\n\t * If 'beforeNode' is null, the nodes are appended at the end.\n\t *\n\t * @param from\n\t * @param to\n\t * @param beforeNode\n\t */\n\tstatic migrateChildren(from, to, beforeNode) {\n\t\tif (beforeNode === undefined) {\n\t\t\tbeforeNode = null;\n\t\t}\n\t\twhile (from.firstChild) {\n\t\t\tto.insertBefore(from.firstChild, beforeNode);\n\t\t}\n\t}\n\n\t/**\n\t * Move 'from'.childNodes to 'to' adding them before 'beforeNode'\n\t * 'from' and 'to' belong to different documents.\n\t *\n\t * If 'beforeNode' is null, the nodes are appended at the end.\n\t *\n\t * @param from\n\t * @param to\n\t * @param beforeNode\n\t */\n\tstatic migrateChildrenBetweenDocs(from, to, beforeNode) {\n\t\tif (beforeNode === undefined) {\n\t\t\tbeforeNode = null;\n\t\t}\n\t\tvar n = from.firstChild;\n\t\tvar destDoc = to.ownerDocument;\n\t\twhile (n) {\n\t\t\tto.insertBefore(destDoc.importNode(n, true), beforeNode);\n\t\t\tn = n.nextSibling;\n\t\t}\n\t}\n\n\t/**\n\t * Check whether this is a DOM element node.\n\t *\n\t * @see http://dom.spec.whatwg.org/#dom-node-nodetype\n\t * @param {Node} node\n\t */\n\tstatic isElt(node) {\n\t\treturn node && node.nodeType === 1;\n\t}\n\n\t/**\n\t * Check whether this is a DOM text 
node.\n\t *\n\t * @see http://dom.spec.whatwg.org/#dom-node-nodetype\n\t * @param {Node} node\n\t */\n\tstatic isText(node) {\n\t\treturn node && node.nodeType === 3;\n\t}\n\n\t/**\n\t * Check whether this is a DOM comment node.\n\t *\n\t * @see http://dom.spec.whatwg.org/#dom-node-nodetype\n\t * @param {Node} node\n\t */\n\tstatic isComment(node) {\n\t\treturn node && node.nodeType === 8;\n\t}\n\n\t/**\n\t * Determine whether this is a block-level DOM element.\n\t *\n\t * @see TokenUtils.isBlockTag\n\t * @param {Node} node\n\t */\n\tstatic isBlockNode(node) {\n\t\treturn node && TokenUtils.isBlockTag(node.nodeName);\n\t}\n\n\tstatic isFormattingElt(node) {\n\t\treturn node && Consts.HTML.FormattingTags.has(node.nodeName);\n\t}\n\n\tstatic isQuoteElt(node) {\n\t\treturn node && Consts.WTQuoteTags.has(node.nodeName);\n\t}\n\n\tstatic isBody(node) {\n\t\treturn node && node.nodeName === 'BODY';\n\t}\n\n\t/**\n\t * Test the number of children this node has without using\n\t * `Node#childNodes.length`.  
This walks the sibling list and so\n\t * takes O(`nchildren`) time -- so `nchildren` is expected to be small\n\t * (say: 0, 1, or 2).\n\t *\n\t * Skips all diff markers by default.\n\t *\n\t * @param node\n\t * @param nchildren\n\t * @param countDiffMarkers\n\t */\n\tstatic hasNChildren(node, nchildren, countDiffMarkers) {\n\t\tfor (var child = node.firstChild; child; child = child.nextSibling) {\n\t\t\tif (!countDiffMarkers && this.isDiffMarker(child)) {\n\t\t\t\tcontinue;\n\t\t\t}\n\t\t\tif (nchildren <= 0) { return false; }\n\t\t\tnchildren -= 1;\n\t\t}\n\t\treturn (nchildren === 0);\n\t}\n\n\t/**\n\t * Build path from a node to its passed-in ancestor.\n\t * Doesn't include the ancestor in the returned path.\n\t *\n\t * @param {Node} node\n\t * @param {Node} ancestor Should be an ancestor of `node`.\n\t * @return {Node[]}\n\t */\n\tstatic pathToAncestor(node, ancestor) {\n\t\tvar path = [];\n\t\twhile (node && node !== ancestor) {\n\t\t\tpath.push(node);\n\t\t\tnode = node.parentNode;\n\t\t}\n\n\t\treturn path;\n\t}\n\n\t/**\n\t * Build path from a node to the root of the document.\n\t *\n\t * @param node\n\t * @return {Node[]}\n\t */\n\tstatic pathToRoot(node) {\n\t\treturn this.pathToAncestor(node, null);\n\t}\n\n\t/**\n\t * Build path from a node to its passed-in sibling.\n\t *\n\t * @param {Node} node\n\t * @param {Node} sibling\n\t * @param {boolean} left Whether to go backwards, i.e., use previousSibling instead of nextSibling.\n\t * @return {Node[]} Will not include the passed-in sibling.\n\t */\n\tstatic pathToSibling(node, sibling, left) {\n\t\tvar path = [];\n\t\twhile (node && node !== sibling) {\n\t\t\tpath.push(node);\n\t\t\tnode = left ? 
node.previousSibling : node.nextSibling;\n\t\t}\n\n\t\treturn path;\n\t}\n\n\t/**\n\t * Check whether a node `n1` comes before another node `n2` in\n\t * their parent's children list.\n\t *\n\t * @param {Node} n1 The node you expect to come first.\n\t * @param {Node} n2 Expected later sibling.\n\t */\n\tstatic inSiblingOrder(n1, n2) {\n\t\twhile (n1 && n1 !== n2) {\n\t\t\tn1 = n1.nextSibling;\n\t\t}\n\t\treturn n1 !== null;\n\t}\n\n\t/**\n\t * Check that a node 'n1' is an ancestor of another node 'n2' in\n\t * the DOM. Returns true if n1 === n2.\n\t *\n\t * @param {Node} n1 The suspected ancestor.\n\t * @param {Node} n2 The suspected descendant.\n\t */\n\tstatic isAncestorOf(n1, n2) {\n\t\twhile (n2 && n2 !== n1) {\n\t\t\tn2 = n2.parentNode;\n\t\t}\n\t\treturn n2 !== null;\n\t}\n\n\t/**\n\t * Check whether `node` has an ancesor named `name`.\n\t *\n\t * @param {Node} node\n\t * @param {string} name\n\t */\n\tstatic hasAncestorOfName(node, name) {\n\t\twhile (node && node.nodeName !== name) {\n\t\t\tnode = node.parentNode;\n\t\t}\n\t\treturn node !== null;\n\t}\n\n\t/**\n\t * Determine whether the node matches the given nodeName and typeof\n\t * attribute value.\n\t *\n\t * @param {Node} n\n\t * @param {string} name node name to test for\n\t * @param {RegExp} type Expected value of \"typeof\" attribute.\n\t * @return {string|null} Matching typeof value, or `null` if the node\n\t *    doesn't match.\n\t */\n\tstatic matchNameAndTypeOf(n, name, type) {\n\t\treturn (n.nodeName === name) ? 
this.matchTypeOf(n, type) : null;\n\t}\n\n\t/**\n\t * Determine whether the node matches the given nodeName and typeof\n\t * attribute value; the typeof is given as string.\n\t *\n\t * @param {Node} n\n\t * @param {string} name node name to test for\n\t * @param {string} type Expected value of \"typeof\" attribute.\n\t * @return {bool} True if the node matches.\n\t */\n\tstatic hasNameAndTypeOf(n, name, type) {\n\t\treturn this.matchNameAndTypeOf(\n\t\t\tn, name, JSUtils.rejoin('^', JSUtils.escapeRegExp(type), '$')\n\t\t) !== null;\n\t}\n\n\t/**\n\t * Determine whether the node matches the given typeof attribute value.\n\t *\n\t * @param {Node} n\n\t * @param {RegExp} type Expected value of \"typeof\" attribute.\n\t * @return {string|null} Matching typeof value, or `null` if the node\n\t *    doesn't match.\n\t */\n\tstatic matchTypeOf(n, type) {\n\t\tif (!this.isElt(n)) { return null; }\n\t\tif (!n.hasAttribute('typeof')) { return null; }\n\t\tfor (const ty of n.getAttribute('typeof').split(/\\s+/g)) {\n\t\t\tif (type.test(ty)) { return ty; }\n\t\t}\n\t\treturn null;\n\t}\n\n\t/**\n\t * Determine whether the node matches the given typeof attribute value.\n\t *\n\t * @param {Node} n\n\t * @param {string} type Expected value of \"typeof\" attribute, as a literal\n\t *   string.\n\t * @return {bool} True if the node matches.\n\t */\n\tstatic hasTypeOf(n, type) {\n\t\treturn this.matchTypeOf(\n\t\t\tn, JSUtils.rejoin('^', JSUtils.escapeRegExp(type), '$')\n\t\t) !== null;\n\t}\n\n\tstatic isFosterablePosition(n) {\n\t\treturn n && Consts.HTML.FosterablePosition.has(n.parentNode.nodeName);\n\t}\n\n\tstatic isList(n) {\n\t\treturn n && Consts.HTML.ListTags.has(n.nodeName);\n\t}\n\n\tstatic isListItem(n) {\n\t\treturn n && Consts.HTML.ListItemTags.has(n.nodeName);\n\t}\n\n\tstatic isListOrListItem(n) {\n\t\treturn this.isList(n) || this.isListItem(n);\n\t}\n\n\tstatic isNestedInListItem(n) {\n\t\tvar parentNode = n.parentNode;\n\t\twhile (parentNode) {\n\t\t\tif 
(this.isListItem(parentNode)) {\n\t\t\t\treturn true;\n\t\t\t}\n\t\t\tparentNode = parentNode.parentNode;\n\t\t}\n\t\treturn false;\n\t}\n\n\tstatic isNestedListOrListItem(n) {\n\t\treturn (this.isList(n) || this.isListItem(n)) && this.isNestedInListItem(n);\n\t}\n\n\t/**\n\t * Check a node to see whether it's a meta with some typeof.\n\t *\n\t * @param {Node} n\n\t * @param {string} type Passed into {@link #hasNameAndTypeOf}.\n\t * @return {bool}\n\t */\n\tstatic isMarkerMeta(n, type) {\n\t\treturn this.hasNameAndTypeOf(n, \"META\", type);\n\t}\n\n\t// FIXME: This would ideally belong in DiffUtils.js\n\t// but that would introduce circular dependencies.\n\tstatic isDiffMarker(node, mark) {\n\t\tif (!node) { return false; }\n\n\t\tif (mark) {\n\t\t\treturn this.isMarkerMeta(node, 'mw:DiffMarker/' + mark);\n\t\t} else {\n\t\t\treturn node.nodeName === 'META' && /\\bmw:DiffMarker\\/\\w*\\b/.test(node.getAttribute('typeof') || '');\n\t\t}\n\t}\n\n\t/**\n\t * Check whether a node has any children that are elements.\n\t *\n\t * @param node\n\t */\n\tstatic hasElementChild(node) {\n\t\tfor (var child = node.firstChild; child; child = child.nextSibling) {\n\t\t\tif (this.isElt(child)) {\n\t\t\t\treturn true;\n\t\t\t}\n\t\t}\n\n\t\treturn false;\n\t}\n\n\t/**\n\t * Check if a node has a block-level element descendant.\n\t *\n\t * @param node\n\t */\n\tstatic hasBlockElementDescendant(node) {\n\t\tfor (var child = node.firstChild; child; child = child.nextSibling) {\n\t\t\tif (this.isElt(child) &&\n\t\t\t\t\t// Is a block-level node\n\t\t\t\t\t(this.isBlockNode(child) ||\n\t\t\t\t\t\t// or has a block-level child or grandchild or..\n\t\t\t\t\t\tthis.hasBlockElementDescendant(child))) {\n\t\t\t\treturn true;\n\t\t\t}\n\t\t}\n\n\t\treturn false;\n\t}\n\n\t/**\n\t * Is a node representing inter-element whitespace?\n\t *\n\t * @param node\n\t */\n\tstatic isIEW(node) {\n\t\t// ws-only\n\t\treturn this.isText(node) && node.nodeValue.match(/^[ \\t\\r\\n]*$/);\n\t}\n\n\tstatic 
isDocumentFragment(node) {\n\t\treturn node && node.nodeType === 11;\n\t}\n\n\tstatic atTheTop(node) {\n\t\treturn this.isDocumentFragment(node) || this.isBody(node);\n\t}\n\n\tstatic isContentNode(node) {\n\t\treturn !this.isComment(node) &&\n\t\t\t!this.isIEW(node) &&\n\t\t\t!this.isDiffMarker(node);\n\t}\n\n\t/**\n\t * Get the first child element or non-IEW text node, ignoring\n\t * whitespace-only text nodes, comments, and deleted nodes.\n\t *\n\t * @param node\n\t */\n\tstatic firstNonSepChild(node) {\n\t\tvar child = node.firstChild;\n\t\twhile (child && !this.isContentNode(child)) {\n\t\t\tchild = child.nextSibling;\n\t\t}\n\t\treturn child;\n\t}\n\n\t/**\n\t * Get the last child element or non-IEW text node, ignoring\n\t * whitespace-only text nodes, comments, and deleted nodes.\n\t *\n\t * @param node\n\t */\n\tstatic lastNonSepChild(node) {\n\t\tvar child = node.lastChild;\n\t\twhile (child && !this.isContentNode(child)) {\n\t\t\tchild = child.previousSibling;\n\t\t}\n\t\treturn child;\n\t}\n\n\tstatic previousNonSepSibling(node) {\n\t\tvar prev = node.previousSibling;\n\t\twhile (prev && !this.isContentNode(prev)) {\n\t\t\tprev = prev.previousSibling;\n\t\t}\n\t\treturn prev;\n\t}\n\n\tstatic nextNonSepSibling(node) {\n\t\tvar next = node.nextSibling;\n\t\twhile (next && !this.isContentNode(next)) {\n\t\t\tnext = next.nextSibling;\n\t\t}\n\t\treturn next;\n\t}\n\n\tstatic numNonDeletedChildNodes(node) {\n\t\tvar n = 0;\n\t\tvar child = node.firstChild;\n\t\twhile (child) {\n\t\t\tif (!this.isDiffMarker(child)) { // FIXME: This is ignoring both inserted/deleted\n\t\t\t\tn++;\n\t\t\t}\n\t\t\tchild = child.nextSibling;\n\t\t}\n\t\treturn n;\n\t}\n\n\t/**\n\t * Get the first non-deleted child of node.\n\t *\n\t * @param node\n\t */\n\tstatic firstNonDeletedChild(node) {\n\t\tvar child = node.firstChild;\n\t\twhile (child && this.isDiffMarker(child)) { // FIXME: This is ignoring both inserted/deleted\n\t\t\tchild = child.nextSibling;\n\t\t}\n\t\treturn 
child;\n\t}\n\n\t/**\n\t * Get the last non-deleted child of node.\n\t *\n\t * @param node\n\t */\n\tstatic lastNonDeletedChild(node) {\n\t\tvar child = node.lastChild;\n\t\twhile (child && this.isDiffMarker(child)) { // FIXME: This is ignoring both inserted/deleted\n\t\t\tchild = child.previousSibling;\n\t\t}\n\t\treturn child;\n\t}\n\n\t/**\n\t * Get the next non deleted sibling.\n\t *\n\t * @param node\n\t */\n\tstatic nextNonDeletedSibling(node) {\n\t\tnode = node.nextSibling;\n\t\twhile (node && this.isDiffMarker(node)) { // FIXME: This is ignoring both inserted/deleted\n\t\t\tnode = node.nextSibling;\n\t\t}\n\t\treturn node;\n\t}\n\n\t/**\n\t * Get the previous non deleted sibling.\n\t *\n\t * @param node\n\t */\n\tstatic previousNonDeletedSibling(node) {\n\t\tnode = node.previousSibling;\n\t\twhile (node && this.isDiffMarker(node)) { // FIXME: This is ignoring both inserted/deleted\n\t\t\tnode = node.previousSibling;\n\t\t}\n\t\treturn node;\n\t}\n\n\t/**\n\t * Are all children of this node text or comment nodes?\n\t *\n\t * @param node\n\t */\n\tstatic allChildrenAreTextOrComments(node) {\n\t\tvar child = node.firstChild;\n\t\twhile (child) {\n\t\t\tif (!this.isDiffMarker(child)\n\t\t\t\t&& !this.isText(child)\n\t\t\t\t&& !this.isComment(child)) {\n\t\t\t\treturn false;\n\t\t\t}\n\t\t\tchild = child.nextSibling;\n\t\t}\n\t\treturn true;\n\t}\n\n\t/**\n\t * Are all children of this node text nodes?\n\t *\n\t * @param node\n\t */\n\tstatic allChildrenAreText(node) {\n\t\tvar child = node.firstChild;\n\t\twhile (child) {\n\t\t\tif (!this.isDiffMarker(child) && !this.isText(child)) {\n\t\t\t\treturn false;\n\t\t\t}\n\t\t\tchild = child.nextSibling;\n\t\t}\n\t\treturn true;\n\t}\n\n\t/**\n\t * Does `node` contain nothing or just non-newline whitespace?\n\t * `strict` adds the condition that all whitespace is forbidden.\n\t *\n\t * @param node\n\t * @param strict\n\t */\n\tstatic nodeEssentiallyEmpty(node, strict) {\n\t\tvar n = node.firstChild;\n\t\twhile (n) 
{\n\t\t\tif (this.isElt(n) && !this.isDiffMarker(n)) {\n\t\t\t\treturn false;\n\t\t\t} else if (this.isText(n) &&\n\t\t\t\t\t(strict || !/^[ \\t]*$/.test(n.nodeValue))) {\n\t\t\t\treturn false;\n\t\t\t} else if (this.isComment(n)) {\n\t\t\t\treturn false;\n\t\t\t}\n\t\t\tn = n.nextSibling;\n\t\t}\n\t\treturn true;\n\t}\n\n\t/**\n\t * Check if the dom-subtree rooted at node has an element with tag name 'tagName'\n\t * The root node is not checked.\n\t *\n\t * @param node\n\t * @param tagName\n\t */\n\tstatic treeHasElement(node, tagName) {\n\t\tnode = node.firstChild;\n\t\twhile (node) {\n\t\t\tif (this.isElt(node)) {\n\t\t\t\tif (node.nodeName === tagName || this.treeHasElement(node, tagName)) {\n\t\t\t\t\treturn true;\n\t\t\t\t}\n\t\t\t}\n\t\t\tnode = node.nextSibling;\n\t\t}\n\n\t\treturn false;\n\t}\n\n\t/**\n\t * Is node a table tag (table, tbody, td, tr, etc.)?\n\t *\n\t * @param {Node} node\n\t * @return {boolean}\n\t */\n\tstatic isTableTag(node) {\n\t\treturn Consts.HTML.TableTags.has(node.nodeName);\n\t}\n\n\t/**\n\t * Returns a media element nested in `node`\n\t *\n\t * @param {Node} node\n\t * @return {Node|null}\n\t */\n\tstatic selectMediaElt(node) {\n\t\treturn node.querySelector('img, video, audio');\n\t}\n\n\t/**\n\t * Extract http-equiv headers from the HTML, including content-language and\n\t * vary headers, if present\n\t *\n\t * @param {Document} doc\n\t * @return {Object}\n\t */\n\tstatic findHttpEquivHeaders(doc) {\n\t\treturn Array.from(doc.querySelectorAll('meta[http-equiv][content]'))\n\t\t.reduce((r,el) => {\n\t\t\tr[el.getAttribute('http-equiv').toLowerCase()] =\n\t\t\t\tel.getAttribute('content');\n\t\t\treturn r;\n\t\t}, {});\n\t}\n\n\t/**\n\t * @param {Document} doc\n\t * @return {string|null}\n\t */\n\tstatic extractInlinedContentVersion(doc) {\n\t\tvar el = doc.querySelector('meta[property=\"mw:html:version\"]');\n\t\treturn el ? 
el.getAttribute('content') : null;\n\t}\n}\n\nif (typeof module === \"object\") {\n\tmodule.exports.DOMUtils = DOMUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/Diff.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":16,"column":1,"nodeType":"Block","endLine":27,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"diff\" type.","line":17,"column":null,"nodeType":"Block","endLine":17,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"srcLengths\" type.","line":18,"column":null,"nodeType":"Block","endLine":18,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"outLengths\" type.","line":19,"column":null,"nodeType":"Block","endLine":19,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"diff\" type.","line":20,"column":null,"nodeType":"Block","endLine":20,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"diff\"","line":20,"column":null,"nodeType":"Block","endLine":20,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"srcLengths\" type.","line":21,"column":null,"nodeType":"Block","endLine":21,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"outLengths\" type.","line":22,"column":null,"nodeType":"Block","endLine":22,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"diff\" type.","line":23,"column":null,"nodeType":"Block","endLine":23,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"srcLengths\" 
type.","line":24,"column":null,"nodeType":"Block","endLine":24,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"outLengths\" type.","line":25,"column":null,"nodeType":"Block","endLine":25,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":104,"column":1,"nodeType":"Block","endLine":107,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"changes\" type.","line":105,"column":null,"nodeType":"Block","endLine":105,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":137,"column":1,"nodeType":"Block","endLine":141,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"oldString\" type.","line":138,"column":null,"nodeType":"Block","endLine":138,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"newString\" type.","line":139,"column":null,"nodeType":"Block","endLine":139,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":161,"column":1,"nodeType":"Block","endLine":165,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"oldString\" type.","line":162,"column":null,"nodeType":"Block","endLine":162,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"newString\" type.","line":163,"column":null,"nodeType":"Block","endLine":163,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":175,"column":1,"nodeType":"Block","endLine":180,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" 
type.","line":176,"column":null,"nodeType":"Block","endLine":176,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":177,"column":null,"nodeType":"Block","endLine":177,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"options\" type.","line":178,"column":null,"nodeType":"Block","endLine":178,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":237,"column":1,"nodeType":"Block","endLine":243,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"diff\" type.","line":241,"column":null,"nodeType":"Block","endLine":241,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":320,"column":1,"nodeType":"Block","endLine":324,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"a\" type.","line":321,"column":null,"nodeType":"Block","endLine":321,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"b\" type.","line":322,"column":null,"nodeType":"Block","endLine":322,"endColumn":null}],"errorCount":0,"warningCount":28,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * Diff tools.\n *\n * @module\n */\n\n'use strict';\n\nvar simpleDiff = require('simplediff');\n\nvar Util = require('./Util.js').Util;\n\n/** @namespace */\nvar Diff = {};\n\n/**\n * @param diff\n * @param srcLengths\n * @param outLengths\n * @param diff\n * @param srcLengths\n * @param outLengths\n * @param diff\n * @param srcLengths\n * @param outLengths\n * @method\n */\nDiff.convertDiffToOffsetPairs = function(diff, srcLengths, outLengths) {\n\tvar currentPair;\n\tvar pairs = [];\n\tvar srcOff = 0;\n\tvar outOff = 0;\n\tvar srcIndex = 0;\n\tvar outIndex = 0;\n\tdiff.forEach(function(change) {\n\t\tvar pushPair = 
function(pair, start) {\n\t\t\tif (!pair.added) {\n\t\t\t\tpair.added = { start: start, end: start };\n\t\t\t} else if (!pair.removed) {\n\t\t\t\tpair.removed = { start: start, end: start };\n\t\t\t}\n\t\t\tpairs.push([ pair.removed, pair.added ]);\n\t\t\tcurrentPair = {};\n\t\t};\n\n\t\t// Use original line lengths;\n\t\tvar srcLen = 0;\n\t\tvar outLen = 0;\n\t\tchange[1].forEach(function() {\n\t\t\tif (change[0] === '+') {\n\t\t\t\toutLen += outLengths[outIndex];\n\t\t\t\toutIndex++;\n\t\t\t} else if (change[0] === '-') {\n\t\t\t\tsrcLen += srcLengths[srcIndex];\n\t\t\t\tsrcIndex++;\n\t\t\t} else {\n\t\t\t\tsrcLen += srcLengths[srcIndex];\n\t\t\t\toutLen += outLengths[outIndex];\n\t\t\t\tsrcIndex++;\n\t\t\t\toutIndex++;\n\t\t\t}\n\t\t});\n\n\t\tif (!currentPair) {\n\t\t\tcurrentPair = {};\n\t\t}\n\n\t\tif (change[0] === '+') {\n\t\t\tif (currentPair.added) {\n\t\t\t\tpushPair(currentPair, srcOff); // srcOff used for adding pair.removed\n\t\t\t}\n\n\t\t\tcurrentPair.added = { start: outOff };\n\t\t\toutOff += outLen;\n\t\t\tcurrentPair.added.end = outOff;\n\n\t\t\tif (currentPair.removed) {\n\t\t\t\tpushPair(currentPair);\n\t\t\t}\n\t\t} else if (change[0] === '-') {\n\t\t\tif (currentPair.removed) {\n\t\t\t\tpushPair(currentPair, outOff); // outOff used for adding pair.added\n\t\t\t}\n\n\t\t\tcurrentPair.removed = { start: srcOff };\n\t\t\tsrcOff += srcLen;\n\t\t\tcurrentPair.removed.end = srcOff;\n\n\t\t\tif (currentPair.added) {\n\t\t\t\tpushPair(currentPair);\n\t\t\t}\n\t\t} else {\n\t\t\tif (currentPair.added || currentPair.removed) {\n\t\t\t\tpushPair(currentPair, currentPair.added ? 
srcOff : outOff);\n\t\t\t}\n\n\t\t\tsrcOff += srcLen;\n\t\t\toutOff += outLen;\n\t\t}\n\t});\n\treturn pairs;\n};\n\n/**\n * @param changes\n * @method\n */\nDiff.convertChangesToXML = function(changes) {\n\tvar result = [];\n\tfor (var i = 0; i < changes.length; i++) {\n\t\tvar change = changes[i];\n\t\tif (change[0] === '+') {\n\t\t\tresult.push('<ins>');\n\t\t} else if (change[0] === '-') {\n\t\t\tresult.push('<del>');\n\t\t}\n\n\t\tresult.push(Util.escapeHtml(change[1].join('')));\n\n\t\tif (change[0] === '+') {\n\t\t\tresult.push('</ins>');\n\t\t} else if (change[0] === '-') {\n\t\t\tresult.push('</del>');\n\t\t}\n\t}\n\treturn result.join('');\n};\n\nvar diffTokens = function(oldString, newString, tokenize) {\n\tif (oldString === newString) {\n\t\treturn [['=', [newString]]];\n\t} else {\n\t\treturn simpleDiff.diff(tokenize(oldString), tokenize(newString));\n\t}\n};\n\n/**\n * @param oldString\n * @param newString\n * @method\n */\nDiff.diffWords = function(oldString, newString) {\n\t// This is a complicated regexp, but it improves on the naive \\b by:\n\t// * keeping tag-like things (<pre>, <a, </a>, etc) together\n\t// * keeping possessives and contractions (don't, etc) together\n\t// * ensuring that newlines always stay separate, so we don't\n\t//   have diff chunks that contain multiple newlines\n\t//   (ie, \"remove \\n\\n\" followed by \"add \\n\", instead of\n\t//   \"keep \\n\", \"remove \\n\")\n\tvar wordTokenize =\n\t\tvalue => value.split(/((?:<\\/?)?\\w+(?:'\\w+|>)?|\\s(?:(?!\\n)\\s)*)/g).filter(\n\t\t\t// For efficiency, filter out zero-length strings from token list\n\t\t\t// UGLY HACK: simplediff trips if one of tokenized words is\n\t\t\t// 'constructor'. 
Since this failure breaks parserTests.js runs,\n\t\t\t// work around that by hiding that diff for now.\n\t\t\ts => s !== '' && s !== 'constructor'\n\t\t);\n\treturn diffTokens(oldString, newString, wordTokenize);\n};\n\n/**\n * @param oldString\n * @param newString\n * @method\n */\nDiff.diffLines = function(oldString, newString) {\n\tvar lineTokenize = function(value) {\n\t\treturn value.split(/^/m).map(function(line) {\n\t\t\treturn line.replace(/\\r$/g, '\\n');\n\t\t});\n\t};\n\treturn diffTokens(oldString, newString, lineTokenize);\n};\n\n/**\n * @param a\n * @param b\n * @param options\n * @method\n */\nDiff.colorDiff = function(a, b, options) {\n\tconst context = options && options.context;\n\tlet diffs = 0;\n\tlet buf = '';\n\tlet before = '';\n\tconst visibleWs = s => s.replace(/[ \\xA0]/g,'\\u2423');\n\tconst funcs = (options && options.html) ? {\n\t\t'+': s => '<font color=\"green\">' + Util.escapeHtml(visibleWs(s)) + '</font>',\n\t\t'-': s => '<font color=\"red\">' + Util.escapeHtml(visibleWs(s)) + '</font>',\n\t\t'=': s => Util.escapeHtml(s),\n\t} : (options && options.noColor) ? {\n\t\t'+': s => '{+' + s + '+}',\n\t\t'-': s => '{-' + s + '-}',\n\t\t'=': s => s,\n\t} : {\n\t\t// add '' to workaround color bug; make spaces visible\n\t\t'+': s => visibleWs(s).green + '',\n\t\t'-': s => visibleWs(s).red + '',\n\t\t'=': s => s,\n\t};\n\tconst NL = (options && options.html) ? '<br/>\\n' : '\\n';\n\tconst DIFFSEP = (options && options.separator) || NL;\n\tconst visibleNL = '\\u21b5';\n\tfor (const change of Diff.diffWords(a, b)) {\n\t\tconst op = change[0];\n\t\tconst value = change[1].join('');\n\t\tif (op !== '=') {\n\t\t\tdiffs++;\n\t\t\tbuf += before;\n\t\t\tbefore = '';\n\t\t\tbuf += value.split('\\n').map((s,i,arr) => {\n\t\t\t\tif (i !== (arr.length - 1)) { s += visibleNL; }\n\t\t\t\treturn s ? 
funcs[op](s) : s;\n\t\t\t}).join(NL);\n\t\t} else {\n\t\t\tif (context) {\n\t\t\t\tconst lines = value.split('\\n');\n\t\t\t\tif (lines.length > 2 * (context + 1)) {\n\t\t\t\t\tconst first = lines.slice(0, context + 1).join(NL);\n\t\t\t\t\tconst last = lines.slice(lines.length - context - 1).join(NL);\n\t\t\t\t\tif (diffs > 0) {\n\t\t\t\t\t\tbuf += first + NL;\n\t\t\t\t\t}\n\t\t\t\t\tbefore = (diffs > 0 ? DIFFSEP : '') + last;\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\t\t\t}\n\t\t\tbuf += value;\n\t\t}\n\t}\n\tif (options && options.diffCount) {\n\t\treturn { count: diffs, output: buf };\n\t}\n\treturn (diffs > 0) ? buf : '';\n};\n\n/**\n * This is essentially lifted from jsDiff@1.4.0, but using our diff and\n * without the header and no newline warning.\n *\n * @param diff\n * @private\n */\nvar createPatch = function(diff) {\n\tvar ret = [];\n\n\tdiff.push({ value: '', lines: [] });  // Append an empty value to make cleanup easier\n\n\t// Formats a given set of lines for printing as context lines in a patch\n\tfunction contextLines(lines) {\n\t\treturn lines.map(function(entry) { return ' ' + entry; });\n\t}\n\n\tvar oldRangeStart = 0;\n\tvar newRangeStart = 0;\n\tvar curRange = [];\n\tvar oldLine = 1;\n\tvar newLine = 1;\n\n\tfor (var i = 0; i < diff.length; i++) {\n\t\tvar current = diff[i];\n\t\tvar lines = current.lines || current.value.replace(/\\n$/, '').split('\\n');\n\t\tcurrent.lines = lines;\n\n\t\tif (current.added || current.removed) {\n\t\t\t// If we have previous context, start with that\n\t\t\tif (!oldRangeStart) {\n\t\t\t\tvar prev = diff[i - 1];\n\t\t\t\toldRangeStart = oldLine;\n\t\t\t\tnewRangeStart = newLine;\n\n\t\t\t\tif (prev) {\n\t\t\t\t\tcurRange = contextLines(prev.lines.slice(-4));\n\t\t\t\t\toldRangeStart -= curRange.length;\n\t\t\t\t\tnewRangeStart -= curRange.length;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// Output our changes\n\t\t\tcurRange.push.apply(curRange, lines.map(function(entry) {\n\t\t\t\treturn (current.added ? 
'+' : '-') + entry;\n\t\t\t}));\n\n\t\t\t// Track the updated file position\n\t\t\tif (current.added) {\n\t\t\t\tnewLine += lines.length;\n\t\t\t} else {\n\t\t\t\toldLine += lines.length;\n\t\t\t}\n\t\t} else {\n\t\t\t// Identical context lines. Track line changes\n\t\t\tif (oldRangeStart) {\n\t\t\t\t// Close out any changes that have been output (or join overlapping)\n\t\t\t\tif (lines.length <= 8 && i < diff.length - 2) {\n\t\t\t\t\t// Overlapping\n\t\t\t\t\tcurRange.push.apply(curRange, contextLines(lines));\n\t\t\t\t} else {\n\t\t\t\t\t// end the range and output\n\t\t\t\t\tvar contextSize = Math.min(lines.length, 4);\n\t\t\t\t\tret.push(\n\t\t\t\t\t\t'@@ -' + oldRangeStart + ',' + (oldLine - oldRangeStart + contextSize)\n\t\t\t\t\t\t+ ' +' + newRangeStart + ',' + (newLine - newRangeStart + contextSize)\n\t\t\t\t\t\t+ ' @@');\n\t\t\t\t\tret.push.apply(ret, curRange);\n\t\t\t\t\tret.push.apply(ret, contextLines(lines.slice(0, contextSize)));\n\n\t\t\t\t\toldRangeStart = 0;\n\t\t\t\t\tnewRangeStart = 0;\n\t\t\t\t\tcurRange = [];\n\t\t\t\t}\n\t\t\t}\n\t\t\toldLine += lines.length;\n\t\t\tnewLine += lines.length;\n\t\t}\n\t}\n\n\treturn ret.join('\\n') + '\\n';\n};\n\n/**\n * @param a\n * @param b\n * @method\n */\nDiff.patchDiff = function(a, b) {\n\t// Essentially lifted from jsDiff@1.4.0's PatchDiff.tokenize\n\tvar patchTokenize = function(value) {\n\t\tvar ret = [];\n\t\tvar linesAndNewlines = value.split(/(\\n|\\r\\n)/);\n\t\t// Ignore the final empty token that occurs if the string ends with a new line\n\t\tif (!linesAndNewlines[linesAndNewlines.length - 1]) {\n\t\t\tlinesAndNewlines.pop();\n\t\t}\n\t\t// Merge the content and line separators into single tokens\n\t\tfor (var i = 0; i < linesAndNewlines.length; i++) {\n\t\t\tvar line = linesAndNewlines[i];\n\t\t\tif (i % 2) {\n\t\t\t\tret[ret.length - 1] += line;\n\t\t\t} else {\n\t\t\t\tret.push(line);\n\t\t\t}\n\t\t}\n\t\treturn ret;\n\t};\n\tvar diffs = 0;\n\tvar diff = diffTokens(a, b, 
patchTokenize)\n\t.map(function(change) {\n\t\tvar value = change[1].join('');\n\t\tswitch (change[0]) {\n\t\t\tcase '+':\n\t\t\t\tdiffs++;\n\t\t\t\treturn { value: value, added: true };\n\t\t\tcase '-':\n\t\t\t\tdiffs++;\n\t\t\t\treturn { value: value, removed: true };\n\t\t\tdefault:\n\t\t\t\treturn { value: value };\n\t\t}\n\t});\n\tif (!diffs) { return null; }\n\treturn createPatch(diff);\n};\n\nif (typeof module === \"object\") {\n\tmodule.exports.Diff = Diff;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/TokenUtils.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":12,"column":2,"nodeType":"Block","endLine":19,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"name\" type.","line":18,"column":null,"nodeType":"Block","endLine":18,"endColumn":null}],"errorCount":0,"warningCount":2,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * @module\n */\n\n'use strict';\n\nrequire('../../core-upgrade.js');\n\nvar Consts = require('../config/WikitextConstants.js').WikitextConstants;\n\nvar TokenUtils = {\n\t/**\n\t * Determine if a tag is block-level or not.\n\t *\n\t * `<video>` is removed from block tags, since it can be phrasing content.\n\t * This is necessary for it to render inline.\n\t *\n\t * @param name\n\t */\n\tisBlockTag: function(name) {\n\t\tname = name.toUpperCase();\n\t\treturn name !== 'VIDEO' && Consts.HTML.HTML4BlockTags.has(name);\n\t},\n\n\tisDOMFragmentType: function(typeOf) {\n\t\treturn /(?:^|\\s)mw:DOMFragment(\\/sealed\\/\\w+)?(?=$|\\s)/.test(typeOf);\n\t},\n\n\t/** @property {RegExp} */\n\tsolTransparentLinkRegexp: /(?:^|\\s)mw:PageProp\\/(?:Category|redirect|Language)(?=$|\\s)/,\n};\n\nif (typeof module === \"object\") {\n\tmodule.exports.TokenUtils = 
TokenUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/Util.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":59,"column":2,"nodeType":"Block","endLine":63,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"name\" type.","line":62,"column":null,"nodeType":"Block","endLine":62,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":129,"column":2,"nodeType":"Block","endLine":138,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"str\" type.","line":134,"column":null,"nodeType":"Block","endLine":134,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"idx\" type.","line":135,"column":null,"nodeType":"Block","endLine":135,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"str\" type.","line":136,"column":null,"nodeType":"Block","endLine":136,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"str\"","line":136,"column":null,"nodeType":"Block","endLine":136,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"idx\" type.","line":137,"column":null,"nodeType":"Block","endLine":137,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":160,"column":2,"nodeType":"Block","endLine":168,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"str\" type.","line":167,"column":null,"nodeType":"Block","endLine":167,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc 
@return declaration.","line":173,"column":2,"nodeType":"Block","endLine":184,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"txt\" type.","line":183,"column":null,"nodeType":"Block","endLine":183,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":260,"column":2,"nodeType":"Block","endLine":264,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Title' is undefined.","line":298,"column":null,"nodeType":"Block","endLine":298,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Title' is undefined.","line":299,"column":null,"nodeType":"Block","endLine":299,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Namespace' is undefined.","line":313,"column":null,"nodeType":"Block","endLine":313,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Namespace' is undefined.","line":314,"column":null,"nodeType":"Block","endLine":314,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":432,"column":2,"nodeType":"Block","endLine":438,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"linkTarget\" type.","line":436,"column":null,"nodeType":"Block","endLine":436,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"env\" type.","line":437,"column":null,"nodeType":"Block","endLine":437,"endColumn":null}],"errorCount":0,"warningCount":20,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * This file contains general utilities for token transforms.\n *\n * @module\n */\n\n'use strict';\n\nrequire('../../core-upgrade.js');\n\nvar crypto = require('crypto');\nvar entities = require('entities');\nvar Consts = 
require('../config/WikitextConstants.js').WikitextConstants;\n\n/**\n * @namespace\n */\nvar Util = {\n\n\t// Non-global and global versions of regexp for use everywhere\n\tCOMMENT_REGEXP: /<!--(?:[^-]|-(?!->))*-->/,\n\tCOMMENT_REGEXP_G: /<!--(?:[^-]|-(?!->))*-->/g,\n\n\t/**\n\t * Update only those properties that are undefined or null in the target.\n\t *\n\t * @param {Object} tgt The object to modify.\n\t * @param {...Object} subject The object to extend tgt with. Add more arguments to the function call to chain more extensions.\n\t * @return {Object} The modified object.\n\t */\n\textendProps: function(tgt, subject /* FIXME: use spread operator */) {\n\t\tfunction internalExtend(target, obj) {\n\t\t\tvar allKeys = [].concat(Object.keys(target), Object.keys(obj));\n\t\t\tfor (var i = 0, numKeys = allKeys.length; i < numKeys; i++) {\n\t\t\t\tvar k = allKeys[i];\n\t\t\t\tif (target[k] === undefined || target[k] === null) {\n\t\t\t\t\ttarget[k] = obj[k];\n\t\t\t\t}\n\t\t\t}\n\t\t\treturn target;\n\t\t}\n\t\tvar n = arguments.length;\n\t\tfor (var j = 1; j < n; j++) {\n\t\t\tinternalExtend(tgt, arguments[j]);\n\t\t}\n\t\treturn tgt;\n\t},\n\n\tstripParsoidIdPrefix: function(aboutId) {\n\t\t// 'mwt' is the prefix used for new ids in mediawiki.parser.environment#newObjectId\n\t\treturn aboutId.replace(/^#?mwt/, '');\n\t},\n\n\tisParsoidObjectId: function(aboutId) {\n\t\t// 'mwt' is the prefix used for new ids in mediawiki.parser.environment#newObjectId\n\t\treturn aboutId.match(/^#mwt/);\n\t},\n\n\t/**\n\t * Determine if the named tag is void (can not have content).\n\t *\n\t * @param name\n\t */\n\tisVoidElement: function(name) {\n\t\treturn Consts.HTML.VoidTags.has(name.toUpperCase());\n\t},\n\n\t// deep clones by default.\n\tclone: function(obj, deepClone) {\n\t\tif (deepClone === undefined) {\n\t\t\tdeepClone = true;\n\t\t}\n\t\tif (Array.isArray(obj)) {\n\t\t\tif (deepClone) {\n\t\t\t\treturn obj.map(function(el) {\n\t\t\t\t\treturn Util.clone(el, 
true);\n\t\t\t\t});\n\t\t\t} else {\n\t\t\t\treturn obj.slice();\n\t\t\t}\n\t\t} else if (obj instanceof Object && // only \"plain objects\"\n\t\t\t\t\tObject.getPrototypeOf(obj) === Object.prototype) {\n\t\t\t/* This definition of \"plain object\" comes from jquery,\n\t\t\t * via zepto.js.  But this is really a big hack; we should\n\t\t\t * probably put a console.assert() here and more precisely\n\t\t\t * delimit what we think is legit to clone. (Hint: not\n\t\t\t * DOM trees.) */\n\t\t\tif (deepClone) {\n\t\t\t\treturn Object.keys(obj).reduce(function(nobj, key) {\n\t\t\t\t\tnobj[key] = Util.clone(obj[key], true);\n\t\t\t\t\treturn nobj;\n\t\t\t\t}, {});\n\t\t\t} else {\n\t\t\t\treturn Object.assign({}, obj);\n\t\t\t}\n\t\t} else {\n\t\t\treturn obj;\n\t\t}\n\t},\n\n\t// Just a copy `Util.clone` used in *testing* to reverse the effects of\n\t// freezing an object.  Works with more that just \"plain objects\"\n\tunFreeze: function(obj, deepClone) {\n\t\tif (deepClone === undefined) {\n\t\t\tdeepClone = true;\n\t\t}\n\t\tif (Array.isArray(obj)) {\n\t\t\tif (deepClone) {\n\t\t\t\treturn obj.map(function(el) {\n\t\t\t\t\treturn Util.unFreeze(el, true);\n\t\t\t\t});\n\t\t\t} else {\n\t\t\t\treturn obj.slice();\n\t\t\t}\n\t\t} else if (obj instanceof Object) {\n\t\t\tif (deepClone) {\n\t\t\t\treturn Object.keys(obj).reduce(function(nobj, key) {\n\t\t\t\t\tnobj[key] = Util.unFreeze(obj[key], true);\n\t\t\t\t\treturn nobj;\n\t\t\t\t}, new obj.constructor());\n\t\t\t} else {\n\t\t\t\treturn Object.assign({}, obj);\n\t\t\t}\n\t\t} else {\n\t\t\treturn obj;\n\t\t}\n\t},\n\n\t/**\n\t * Extract the last *unicode* character of the string.\n\t * This might be more than one javascript character, if the\n\t * last character is a martian.\n\t *\n\t * @param str\n\t * @param idx\n\t * @param str\n\t * @param idx\n\t */\n\tlastUniChar: function(str, idx) {\n\t\tif (idx === undefined) { idx = str.length; }\n\t\tif (idx <= 0 || idx > str.length) { return ''; }\n\t\tlet s = 
str[--idx];\n\t\tif (/[\\uDC00-\\uDFFF]/.test(s)) {\n\t\t\ts = str[--idx] + s;\n\t\t}\n\t\treturn s;\n\t},\n\n\t// Arguably we shouldn't be using this; see:\n\t// https://phabricator.wikimedia.org/T238022#5665580\n\tisUniWord: function(c) {\n\t\ttry {\n\t\t\t// Have to hide this regexp from eslint (!)\n\t\t\treturn (new RegExp(\"^[\\\\p{L}\\\\p{N}_]\", \"u\")).test(c);\n\t\t} catch (e) { /* oh, well, we have to do this the hard way */ }\n\t\t// Courtesy of https://mothereff.in/regexpu for the above\n\t\treturn /^[0-9A-Z_a-z\\xAA\\xB2\\xB3\\xB5\\xB9\\xBA\\xBC-\\xBE\\xC0-\\xD6\\xD8-\\xF6\\xF8-\\u02C1\\u02C6-\\u02D1\\u02E0-\\u02E4\\u02EC\\u02EE\\u0370-\\u0374\\u0376\\u0377\\u037A-\\u037D\\u037F\\u0386\\u0388-\\u038A\\u038C\\u038E-\\u03A1\\u03A3-\\u03F5\\u03F7-\\u0481\\u048A-\\u052F\\u0531-\\u0556\\u0559\\u0560-\\u0588\\u05D0-\\u05EA\\u05EF-\\u05F2\\u0620-\\u064A\\u0660-\\u0669\\u066E\\u066F\\u0671-\\u06D3\\u06D5\\u06E5\\u06E6\\u06EE-\\u06FC\\u06FF\\u0710\\u0712-\\u072F\\u074D-\\u07A5\\u07B1\\u07C0-\\u07EA\\u07F4\\u07F5\\u07FA\\u0800-\\u0815\\u081A\\u0824\\u0828\\u0840-\\u0858\\u0860-\\u086A\\u08A0-\\u08B4\\u08B6-\\u08BD\\u0904-\\u0939\\u093D\\u0950\\u0958-\\u0961\\u0966-\\u096F\\u0971-\\u0980\\u0985-\\u098C\\u098F\\u0990\\u0993-\\u09A8\\u09AA-\\u09B0\\u09B2\\u09B6-\\u09B9\\u09BD\\u09CE\\u09DC\\u09DD\\u09DF-\\u09E1\\u09E6-\\u09F1\\u09F4-\\u09F9\\u09FC\\u0A05-\\u0A0A\\u0A0F\\u0A10\\u0A13-\\u0A28\\u0A2A-\\u0A30\\u0A32\\u0A33\\u0A35\\u0A36\\u0A38\\u0A39\\u0A59-\\u0A5C\\u0A5E\\u0A66-\\u0A6F\\u0A72-\\u0A74\\u0A85-\\u0A8D\\u0A8F-\\u0A91\\u0A93-\\u0AA8\\u0AAA-\\u0AB0\\u0AB2\\u0AB3\\u0AB5-\\u0AB9\\u0ABD\\u0AD0\\u0AE0\\u0AE1\\u0AE6-\\u0AEF\\u0AF9\\u0B05-\\u0B0C\\u0B0F\\u0B10\\u0B13-\\u0B28\\u0B2A-\\u0B30\\u0B32\\u0B33\\u0B35-\\u0B39\\u0B3D\\u0B5C\\u0B5D\\u0B5F-\\u0B61\\u0B66-\\u0B6F\\u0B71-\\u0B77\\u0B83\\u0B85-\\u0B8A\\u0B8E-\\u0B90\\u0B92-\\u0B95\\u0B99\\u0B9A\\u0B9C\\u0B9E\\u0B9F\\u0BA3\\u0BA4\\u0BA8-\\u0BAA\\u0BAE-\\u0BB9\\u0BD0\\u0BE6-\\u0BF2\\u0C05-\\u0C0C\\u0C0E-\\u0C10\\
u0C12-\\u0C28\\u0C2A-\\u0C39\\u0C3D\\u0C58-\\u0C5A\\u0C60\\u0C61\\u0C66-\\u0C6F\\u0C78-\\u0C7E\\u0C80\\u0C85-\\u0C8C\\u0C8E-\\u0C90\\u0C92-\\u0CA8\\u0CAA-\\u0CB3\\u0CB5-\\u0CB9\\u0CBD\\u0CDE\\u0CE0\\u0CE1\\u0CE6-\\u0CEF\\u0CF1\\u0CF2\\u0D05-\\u0D0C\\u0D0E-\\u0D10\\u0D12-\\u0D3A\\u0D3D\\u0D4E\\u0D54-\\u0D56\\u0D58-\\u0D61\\u0D66-\\u0D78\\u0D7A-\\u0D7F\\u0D85-\\u0D96\\u0D9A-\\u0DB1\\u0DB3-\\u0DBB\\u0DBD\\u0DC0-\\u0DC6\\u0DE6-\\u0DEF\\u0E01-\\u0E30\\u0E32\\u0E33\\u0E40-\\u0E46\\u0E50-\\u0E59\\u0E81\\u0E82\\u0E84\\u0E86-\\u0E8A\\u0E8C-\\u0EA3\\u0EA5\\u0EA7-\\u0EB0\\u0EB2\\u0EB3\\u0EBD\\u0EC0-\\u0EC4\\u0EC6\\u0ED0-\\u0ED9\\u0EDC-\\u0EDF\\u0F00\\u0F20-\\u0F33\\u0F40-\\u0F47\\u0F49-\\u0F6C\\u0F88-\\u0F8C\\u1000-\\u102A\\u103F-\\u1049\\u1050-\\u1055\\u105A-\\u105D\\u1061\\u1065\\u1066\\u106E-\\u1070\\u1075-\\u1081\\u108E\\u1090-\\u1099\\u10A0-\\u10C5\\u10C7\\u10CD\\u10D0-\\u10FA\\u10FC-\\u1248\\u124A-\\u124D\\u1250-\\u1256\\u1258\\u125A-\\u125D\\u1260-\\u1288\\u128A-\\u128D\\u1290-\\u12B0\\u12B2-\\u12B5\\u12B8-\\u12BE\\u12C0\\u12C2-\\u12C5\\u12C8-\\u12D6\\u12D8-\\u1310\\u1312-\\u1315\\u1318-\\u135A\\u1369-\\u137C\\u1380-\\u138F\\u13A0-\\u13F5\\u13F8-\\u13FD\\u1401-\\u166C\\u166F-\\u167F\\u1681-\\u169A\\u16A0-\\u16EA\\u16EE-\\u16F8\\u1700-\\u170C\\u170E-\\u1711\\u1720-\\u1731\\u1740-\\u1751\\u1760-\\u176C\\u176E-\\u1770\\u1780-\\u17B3\\u17D7\\u17DC\\u17E0-\\u17E9\\u17F0-\\u17F9\\u1810-\\u1819\\u1820-\\u1878\\u1880-\\u1884\\u1887-\\u18A8\\u18AA\\u18B0-\\u18F5\\u1900-\\u191E\\u1946-\\u196D\\u1970-\\u1974\\u1980-\\u19AB\\u19B0-\\u19C9\\u19D0-\\u19DA\\u1A00-\\u1A16\\u1A20-\\u1A54\\u1A80-\\u1A89\\u1A90-\\u1A99\\u1AA7\\u1B05-\\u1B33\\u1B45-\\u1B4B\\u1B50-\\u1B59\\u1B83-\\u1BA0\\u1BAE-\\u1BE5\\u1C00-\\u1C23\\u1C40-\\u1C49\\u1C4D-\\u1C7D\\u1C80-\\u1C88\\u1C90-\\u1CBA\\u1CBD-\\u1CBF\\u1CE9-\\u1CEC\\u1CEE-\\u1CF3\\u1CF5\\u1CF6\\u1CFA\\u1D00-\\u1DBF\\u1E00-\\u1F15\\u1F18-\\u1F1D\\u1F20-\\u1F45\\u1F48-\\u1F4D\\u1F50-\\u1F57\\u1F59\\u1F5B\\u1F5D\\u1F5F-\\u1F7D\\u1F80-\\u1FB4\\u1FB6-\\u1F
BC\\u1FBE\\u1FC2-\\u1FC4\\u1FC6-\\u1FCC\\u1FD0-\\u1FD3\\u1FD6-\\u1FDB\\u1FE0-\\u1FEC\\u1FF2-\\u1FF4\\u1FF6-\\u1FFC\\u2070\\u2071\\u2074-\\u2079\\u207F-\\u2089\\u2090-\\u209C\\u2102\\u2107\\u210A-\\u2113\\u2115\\u2119-\\u211D\\u2124\\u2126\\u2128\\u212A-\\u212D\\u212F-\\u2139\\u213C-\\u213F\\u2145-\\u2149\\u214E\\u2150-\\u2189\\u2460-\\u249B\\u24EA-\\u24FF\\u2776-\\u2793\\u2C00-\\u2C2E\\u2C30-\\u2C5E\\u2C60-\\u2CE4\\u2CEB-\\u2CEE\\u2CF2\\u2CF3\\u2CFD\\u2D00-\\u2D25\\u2D27\\u2D2D\\u2D30-\\u2D67\\u2D6F\\u2D80-\\u2D96\\u2DA0-\\u2DA6\\u2DA8-\\u2DAE\\u2DB0-\\u2DB6\\u2DB8-\\u2DBE\\u2DC0-\\u2DC6\\u2DC8-\\u2DCE\\u2DD0-\\u2DD6\\u2DD8-\\u2DDE\\u2E2F\\u3005-\\u3007\\u3021-\\u3029\\u3031-\\u3035\\u3038-\\u303C\\u3041-\\u3096\\u309D-\\u309F\\u30A1-\\u30FA\\u30FC-\\u30FF\\u3105-\\u312F\\u3131-\\u318E\\u3192-\\u3195\\u31A0-\\u31BA\\u31F0-\\u31FF\\u3220-\\u3229\\u3248-\\u324F\\u3251-\\u325F\\u3280-\\u3289\\u32B1-\\u32BF\\u3400-\\u4DB5\\u4E00-\\u9FEF\\uA000-\\uA48C\\uA4D0-\\uA4FD\\uA500-\\uA60C\\uA610-\\uA62B\\uA640-\\uA66E\\uA67F-\\uA69D\\uA6A0-\\uA6EF\\uA717-\\uA71F\\uA722-\\uA788\\uA78B-\\uA7BF\\uA7C2-\\uA7C6\\uA7F7-\\uA801\\uA803-\\uA805\\uA807-\\uA80A\\uA80C-\\uA822\\uA830-\\uA835\\uA840-\\uA873\\uA882-\\uA8B3\\uA8D0-\\uA8D9\\uA8F2-\\uA8F7\\uA8FB\\uA8FD\\uA8FE\\uA900-\\uA925\\uA930-\\uA946\\uA960-\\uA97C\\uA984-\\uA9B2\\uA9CF-\\uA9D9\\uA9E0-\\uA9E4\\uA9E6-\\uA9FE\\uAA00-\\uAA28\\uAA40-\\uAA42\\uAA44-\\uAA4B\\uAA50-\\uAA59\\uAA60-\\uAA76\\uAA7A\\uAA7E-\\uAAAF\\uAAB1\\uAAB5\\uAAB6\\uAAB9-\\uAABD\\uAAC0\\uAAC2\\uAADB-\\uAADD\\uAAE0-\\uAAEA\\uAAF2-\\uAAF4\\uAB01-\\uAB06\\uAB09-\\uAB0E\\uAB11-\\uAB16\\uAB20-\\uAB26\\uAB28-\\uAB2E\\uAB30-\\uAB5A\\uAB5C-\\uAB67\\uAB70-\\uABE2\\uABF0-\\uABF9\\uAC00-\\uD7A3\\uD7B0-\\uD7C6\\uD7CB-\\uD7FB\\uF900-\\uFA6D\\uFA70-\\uFAD9\\uFB00-\\uFB06\\uFB13-\\uFB17\\uFB1D\\uFB1F-\\uFB28\\uFB2A-\\uFB36\\uFB38-\\uFB3C\\uFB3E\\uFB40\\uFB41\\uFB43\\uFB44\\uFB46-\\uFBB1\\uFBD3-\\uFD3D\\uFD50-\\uFD8F\\uFD92-\\uFDC7\\uFDF0-\\uFDFB\\uFE70-\\uFE74\\uFE76-\\uFEFC\\uF
F10-\\uFF19\\uFF21-\\uFF3A\\uFF41-\\uFF5A\\uFF66-\\uFFBE\\uFFC2-\\uFFC7\\uFFCA-\\uFFCF\\uFFD2-\\uFFD7\\uFFDA-\\uFFDC\\u{10000}-\\u{1000B}\\u{1000D}-\\u{10026}\\u{10028}-\\u{1003A}\\u{1003C}\\u{1003D}\\u{1003F}-\\u{1004D}\\u{10050}-\\u{1005D}\\u{10080}-\\u{100FA}\\u{10107}-\\u{10133}\\u{10140}-\\u{10178}\\u{1018A}\\u{1018B}\\u{10280}-\\u{1029C}\\u{102A0}-\\u{102D0}\\u{102E1}-\\u{102FB}\\u{10300}-\\u{10323}\\u{1032D}-\\u{1034A}\\u{10350}-\\u{10375}\\u{10380}-\\u{1039D}\\u{103A0}-\\u{103C3}\\u{103C8}-\\u{103CF}\\u{103D1}-\\u{103D5}\\u{10400}-\\u{1049D}\\u{104A0}-\\u{104A9}\\u{104B0}-\\u{104D3}\\u{104D8}-\\u{104FB}\\u{10500}-\\u{10527}\\u{10530}-\\u{10563}\\u{10600}-\\u{10736}\\u{10740}-\\u{10755}\\u{10760}-\\u{10767}\\u{10800}-\\u{10805}\\u{10808}\\u{1080A}-\\u{10835}\\u{10837}\\u{10838}\\u{1083C}\\u{1083F}-\\u{10855}\\u{10858}-\\u{10876}\\u{10879}-\\u{1089E}\\u{108A7}-\\u{108AF}\\u{108E0}-\\u{108F2}\\u{108F4}\\u{108F5}\\u{108FB}-\\u{1091B}\\u{10920}-\\u{10939}\\u{10980}-\\u{109B7}\\u{109BC}-\\u{109CF}\\u{109D2}-\\u{10A00}\\u{10A10}-\\u{10A13}\\u{10A15}-\\u{10A17}\\u{10A19}-\\u{10A35}\\u{10A40}-\\u{10A48}\\u{10A60}-\\u{10A7E}\\u{10A80}-\\u{10A9F}\\u{10AC0}-\\u{10AC7}\\u{10AC9}-\\u{10AE4}\\u{10AEB}-\\u{10AEF}\\u{10B00}-\\u{10B35}\\u{10B40}-\\u{10B55}\\u{10B58}-\\u{10B72}\\u{10B78}-\\u{10B91}\\u{10BA9}-\\u{10BAF}\\u{10C00}-\\u{10C48}\\u{10C80}-\\u{10CB2}\\u{10CC0}-\\u{10CF2}\\u{10CFA}-\\u{10D23}\\u{10D30}-\\u{10D39}\\u{10E60}-\\u{10E7E}\\u{10F00}-\\u{10F27}\\u{10F30}-\\u{10F45}\\u{10F51}-\\u{10F54}\\u{10FE0}-\\u{10FF6}\\u{11003}-\\u{11037}\\u{11052}-\\u{1106F}\\u{11083}-\\u{110AF}\\u{110D0}-\\u{110E8}\\u{110F0}-\\u{110F9}\\u{11103}-\\u{11126}\\u{11136}-\\u{1113F}\\u{11144}\\u{11150}-\\u{11172}\\u{11176}\\u{11183}-\\u{111B2}\\u{111C1}-\\u{111C4}\\u{111D0}-\\u{111DA}\\u{111DC}\\u{111E1}-\\u{111F4}\\u{11200}-\\u{11211}\\u{11213}-\\u{1122B}\\u{11280}-\\u{11286}\\u{11288}\\u{1128A}-\\u{1128D}\\u{1128F}-\\u{1129D}\\u{1129F}-\\u{112A8}\\u{112B0}-\\u{112DE}\\u{112F0}-\\u{112F9}\
\u{11305}-\\u{1130C}\\u{1130F}\\u{11310}\\u{11313}-\\u{11328}\\u{1132A}-\\u{11330}\\u{11332}\\u{11333}\\u{11335}-\\u{11339}\\u{1133D}\\u{11350}\\u{1135D}-\\u{11361}\\u{11400}-\\u{11434}\\u{11447}-\\u{1144A}\\u{11450}-\\u{11459}\\u{1145F}\\u{11480}-\\u{114AF}\\u{114C4}\\u{114C5}\\u{114C7}\\u{114D0}-\\u{114D9}\\u{11580}-\\u{115AE}\\u{115D8}-\\u{115DB}\\u{11600}-\\u{1162F}\\u{11644}\\u{11650}-\\u{11659}\\u{11680}-\\u{116AA}\\u{116B8}\\u{116C0}-\\u{116C9}\\u{11700}-\\u{1171A}\\u{11730}-\\u{1173B}\\u{11800}-\\u{1182B}\\u{118A0}-\\u{118F2}\\u{118FF}\\u{119A0}-\\u{119A7}\\u{119AA}-\\u{119D0}\\u{119E1}\\u{119E3}\\u{11A00}\\u{11A0B}-\\u{11A32}\\u{11A3A}\\u{11A50}\\u{11A5C}-\\u{11A89}\\u{11A9D}\\u{11AC0}-\\u{11AF8}\\u{11C00}-\\u{11C08}\\u{11C0A}-\\u{11C2E}\\u{11C40}\\u{11C50}-\\u{11C6C}\\u{11C72}-\\u{11C8F}\\u{11D00}-\\u{11D06}\\u{11D08}\\u{11D09}\\u{11D0B}-\\u{11D30}\\u{11D46}\\u{11D50}-\\u{11D59}\\u{11D60}-\\u{11D65}\\u{11D67}\\u{11D68}\\u{11D6A}-\\u{11D89}\\u{11D98}\\u{11DA0}-\\u{11DA9}\\u{11EE0}-\\u{11EF2}\\u{11FC0}-\\u{11FD4}\\u{12000}-\\u{12399}\\u{12400}-\\u{1246E}\\u{12480}-\\u{12543}\\u{13000}-\\u{1342E}\\u{14400}-\\u{14646}\\u{16800}-\\u{16A38}\\u{16A40}-\\u{16A5E}\\u{16A60}-\\u{16A69}\\u{16AD0}-\\u{16AED}\\u{16B00}-\\u{16B2F}\\u{16B40}-\\u{16B43}\\u{16B50}-\\u{16B59}\\u{16B5B}-\\u{16B61}\\u{16B63}-\\u{16B77}\\u{16B7D}-\\u{16B8F}\\u{16E40}-\\u{16E96}\\u{16F00}-\\u{16F4A}\\u{16F50}\\u{16F93}-\\u{16F9F}\\u{16FE0}\\u{16FE1}\\u{16FE3}\\u{17000}-\\u{187F7}\\u{18800}-\\u{18AF2}\\u{1B000}-\\u{1B11E}\\u{1B150}-\\u{1B152}\\u{1B164}-\\u{1B167}\\u{1B170}-\\u{1B2FB}\\u{1BC00}-\\u{1BC6A}\\u{1BC70}-\\u{1BC7C}\\u{1BC80}-\\u{1BC88}\\u{1BC90}-\\u{1BC99}\\u{1D2E0}-\\u{1D2F3}\\u{1D360}-\\u{1D378}\\u{1D400}-\\u{1D454}\\u{1D456}-\\u{1D49C}\\u{1D49E}\\u{1D49F}\\u{1D4A2}\\u{1D4A5}\\u{1D4A6}\\u{1D4A9}-\\u{1D4AC}\\u{1D4AE}-\\u{1D4B9}\\u{1D4BB}\\u{1D4BD}-\\u{1D4C3}\\u{1D4C5}-\\u{1D505}\\u{1D507}-\\u{1D50A}\\u{1D50D}-\\u{1D514}\\u{1D516}-\\u{1D51C}\\u{1D51E}-\\u{1D539}\\u{1D53B}-\\u{1D53E}\\u
{1D540}-\\u{1D544}\\u{1D546}\\u{1D54A}-\\u{1D550}\\u{1D552}-\\u{1D6A5}\\u{1D6A8}-\\u{1D6C0}\\u{1D6C2}-\\u{1D6DA}\\u{1D6DC}-\\u{1D6FA}\\u{1D6FC}-\\u{1D714}\\u{1D716}-\\u{1D734}\\u{1D736}-\\u{1D74E}\\u{1D750}-\\u{1D76E}\\u{1D770}-\\u{1D788}\\u{1D78A}-\\u{1D7A8}\\u{1D7AA}-\\u{1D7C2}\\u{1D7C4}-\\u{1D7CB}\\u{1D7CE}-\\u{1D7FF}\\u{1E100}-\\u{1E12C}\\u{1E137}-\\u{1E13D}\\u{1E140}-\\u{1E149}\\u{1E14E}\\u{1E2C0}-\\u{1E2EB}\\u{1E2F0}-\\u{1E2F9}\\u{1E800}-\\u{1E8C4}\\u{1E8C7}-\\u{1E8CF}\\u{1E900}-\\u{1E943}\\u{1E94B}\\u{1E950}-\\u{1E959}\\u{1EC71}-\\u{1ECAB}\\u{1ECAD}-\\u{1ECAF}\\u{1ECB1}-\\u{1ECB4}\\u{1ED01}-\\u{1ED2D}\\u{1ED2F}-\\u{1ED3D}\\u{1EE00}-\\u{1EE03}\\u{1EE05}-\\u{1EE1F}\\u{1EE21}\\u{1EE22}\\u{1EE24}\\u{1EE27}\\u{1EE29}-\\u{1EE32}\\u{1EE34}-\\u{1EE37}\\u{1EE39}\\u{1EE3B}\\u{1EE42}\\u{1EE47}\\u{1EE49}\\u{1EE4B}\\u{1EE4D}-\\u{1EE4F}\\u{1EE51}\\u{1EE52}\\u{1EE54}\\u{1EE57}\\u{1EE59}\\u{1EE5B}\\u{1EE5D}\\u{1EE5F}\\u{1EE61}\\u{1EE62}\\u{1EE64}\\u{1EE67}-\\u{1EE6A}\\u{1EE6C}-\\u{1EE72}\\u{1EE74}-\\u{1EE77}\\u{1EE79}-\\u{1EE7C}\\u{1EE7E}\\u{1EE80}-\\u{1EE89}\\u{1EE8B}-\\u{1EE9B}\\u{1EEA1}-\\u{1EEA3}\\u{1EEA5}-\\u{1EEA9}\\u{1EEAB}-\\u{1EEBB}\\u{1F100}-\\u{1F10C}\\u{20000}-\\u{2A6D6}\\u{2A700}-\\u{2B734}\\u{2B740}-\\u{2B81D}\\u{2B820}-\\u{2CEA1}\\u{2CEB0}-\\u{2EBE0}\\u{2F800}-\\u{2FA1D}]/u.test(c);\n\t},\n\n\t/**\n\t * Emulate PHP's trim, which is almost-but-not-quite like JS's trim.\n\t *\n\t * PHP: https://www.php.net/manual/en/function.trim.php\n\t *\n\t * JS: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/trim\n\t *\n\t * @param str\n\t */\n\tphpTrim: function(str) {\n\t\treturn str.replace(/(?:^[ \\t\\n\\r\\0\\x0B]+)|(?:[ \\t\\n\\r\\0\\x0B]+$)/g, '');\n\t},\n\n\t/**\n\t * Emulate PHP's urlencode by patching results of\n\t * JS's `encodeURIComponent`.\n\t *\n\t * PHP: https://secure.php.net/manual/en/function.urlencode.php\n\t *\n\t * JS:  
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/encodeURIComponent\n\t *\n\t * Spaces to '+' is a PHP peculiarity as well.\n\t *\n\t * @param txt\n\t */\n\tphpURLEncode: function(txt) {\n\t\treturn encodeURIComponent(txt)\n\t\t\t.replace(/!/g, '%21')\n\t\t\t.replace(/'/g, '%27')\n\t\t\t.replace(/\\(/g, '%28')\n\t\t\t.replace(/\\)/g, '%29')\n\t\t\t.replace(/\\*/g, '%2A')\n\t\t\t.replace(/~/g, '%7E')\n\t\t\t.replace(/%20/g, '+');\n\t},\n\n\t/*\n\t * Wraps `decodeURI` in a try/catch to suppress throws from malformed URI\n\t * sequences.  Distinct from `decodeURIComponent` in that certain\n\t * sequences aren't decoded if they result in (un)reserved characters.\n\t */\n\tdecodeURI: function(s) {\n\t\t// Most of the time we should have valid input\n\t\ttry {\n\t\t\treturn decodeURI(s);\n\t\t} catch (e) {\n\t\t\t// Fall through\n\t\t}\n\n\t\t// Extract each encoded character and decode it individually\n\t\treturn s.replace(\n\t\t\t/%[0-7][0-9A-F]|%[CD][0-9A-F]%[89AB][0-9A-F]|%E[0-9A-F](?:%[89AB][0-9A-F]){2}|%F[0-4](?:%[89AB][0-9A-F]){3}/gi,\n\t\t\tfunction(m) {\n\t\t\t\ttry {\n\t\t\t\t\treturn decodeURI(m);\n\t\t\t\t} catch (e) {\n\t\t\t\t\treturn m;\n\t\t\t\t}\n\t\t\t}\n\t\t);\n\t},\n\n\t/*\n\t * Wraps `decodeURIComponent` in a try/catch to suppress throws from\n\t * malformed URI sequences.\n\t */\n\tdecodeURIComponent: function(s) {\n\t\t// Most of the time we should have valid input\n\t\ttry {\n\t\t\treturn decodeURIComponent(s);\n\t\t} catch (e) {\n\t\t\t// Fall through\n\t\t}\n\n\t\t// Extract each encoded character and decode it individually\n\t\treturn s.replace(\n\t\t\t/%[0-7][0-9A-F]|%[CD][0-9A-F]%[89AB][0-9A-F]|%E[0-9A-F](?:%[89AB][0-9A-F]){2}|%F[0-4](?:%[89AB][0-9A-F]){3}/gi,\n\t\t\tfunction(m) {\n\t\t\t\ttry {\n\t\t\t\t\treturn decodeURIComponent(m);\n\t\t\t\t} catch (e) {\n\t\t\t\t\treturn m;\n\t\t\t\t}\n\t\t\t}\n\t\t);\n\t},\n\n\textractExtBody: function(token) {\n\t\tvar extSrc = token.getAttribute('source');\n\t\tvar 
extTagOffsets = token.dataAttribs.extTagOffsets;\n\t\treturn extSrc.slice(extTagOffsets[2], -extTagOffsets[3]);\n\t},\n\n\tisValidDSR: function(dsr, all) {\n\t\tconst isValidOffset = n => typeof (n) === 'number' && n >= 0;\n\t\treturn dsr &&\n\t\t\tisValidOffset(dsr[0]) && isValidOffset(dsr[1]) &&\n\t\t\t(!all || (isValidOffset(dsr[2]) && isValidOffset(dsr[3])));\n\t},\n\n\t/**\n\t * Quickly hash an array or string.\n\t *\n\t * @param {Array|string} arr\n\t */\n\tmakeHash: function(arr) {\n\t\tvar md5 = crypto.createHash('MD5');\n\t\tvar i;\n\t\tif (Array.isArray(arr)) {\n\t\t\tfor (i = 0; i < arr.length; i++) {\n\t\t\t\tif (arr[i] instanceof String) {\n\t\t\t\t\tmd5.update(arr[i]);\n\t\t\t\t} else {\n\t\t\t\t\tmd5.update(arr[i].toString());\n\t\t\t\t}\n\t\t\t\tmd5.update(\"\\0\");\n\t\t\t}\n\t\t} else {\n\t\t\tmd5.update(arr);\n\t\t}\n\t\treturn md5.digest('hex');\n\t},\n\n\t/**\n\t * Cannonicalizes a namespace name.\n\t *\n\t * Used by {@link WikiConfig}.\n\t *\n\t * @param {string} name Non-normalized namespace name.\n\t * @return {string}\n\t */\n\tnormalizeNamespaceName: function(name) {\n\t\treturn name.toLowerCase().replace(' ', '_');\n\t},\n\n\t/**\n\t * Compare two titles for equality.\n\t *\n\t * @param {Title} t1\n\t * @param {Title} t2\n\t * @return {boolean}\n\t */\n\ttitleEquals: function(t1, t2) {\n\t\t// See: https://github.com/wikimedia/mediawiki-title/pull/43\n\t\treturn (t1 === t2) || (\n\t\t\tt1 !== null && t2 !== null && t1.getKey() === t2.getKey() &&\n\t\t\tUtil.namespaceEquals(t1.getNamespace(), t2.getNamespace())\n\t\t);\n\t},\n\n\t/**\n\t * Compare two namespaces for equality.\n\t *\n\t * @param {Namespace} n1\n\t * @param {Namespace} n2\n\t * @return {boolean}\n\t */\n\tnamespaceEquals: function(n1, n2) {\n\t\t// We shouldn't have to access the private _id field of namespace :(\n\t\t// See: https://github.com/wikimedia/mediawiki-title/pull/43\n\t\treturn (n1 === n2) || (n1 !== null && n2 !== null && n1._id === n2._id);\n\t},\n\n\t/**\n\t * 
Decode HTML5 entities in wikitext.\n\t *\n\t * NOTE that wikitext only allows semicolon-terminated entities, while\n\t * HTML allows a number of \"legacy\" entities to be decoded without\n\t * a terminating semicolon.  This function deliberately does not\n\t * decode these HTML-only entity forms.\n\t *\n\t * @param {string} text\n\t * @return {string}\n\t */\n\tdecodeWtEntities: function(text) {\n\t\t// HTML5 allows semicolon-less entities which wikitext does not:\n\t\t// in wikitext all entities must end in a semicolon.\n\t\treturn text.replace(\n\t\t\t/&[#0-9a-zA-Z]+;/g,\n\t\t\t(match) => {\n\t\t\t\t// Be careful: `&ampamp;` can get through the above, which\n\t\t\t\t// decodeHTML5 will decode to `&amp;` -- but that's a sneaky\n\t\t\t\t// semicolon-less entity!\n\t\t\t\tconst m = /^&#(?:x([A-Fa-f0-9]+)|(\\d+));$/.exec(match);\n\t\t\t\tlet c, cp;\n\t\t\t\tif (m) {\n\t\t\t\t\t// entities contains a bunch of weird legacy mappings\n\t\t\t\t\t// for numeric codepoints (T113194) which we don't want.\n\t\t\t\t\tif (m[1]) {\n\t\t\t\t\t\tcp = Number.parseInt(m[1], 16);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tcp = Number.parseInt(m[2], 10);\n\t\t\t\t\t}\n\t\t\t\t\tif (cp > 0x10FFFF) {\n\t\t\t\t\t\t// Invalid entity, don't give to String.fromCodePoint\n\t\t\t\t\t\treturn match;\n\t\t\t\t\t}\n\t\t\t\t\tc = String.fromCodePoint(cp);\n\t\t\t\t} else {\n\t\t\t\t\tc = entities.decodeHTML5(match);\n\t\t\t\t\t// Length can be legit greater than one if it is astral\n\t\t\t\t\tif (c.length > 1 && c.endsWith(';')) {\n\t\t\t\t\t\t// Invalid entity!\n\t\t\t\t\t\treturn match;\n\t\t\t\t\t}\n\t\t\t\t\tcp = c.codePointAt(0);\n\t\t\t\t}\n\t\t\t\t// Check other banned codepoints (T106578)\n\t\t\t\tif (\n\t\t\t\t\t(cp < 0x09) ||\n\t\t\t\t\t(cp > 0x0A && cp < 0x20) ||\n\t\t\t\t\t(cp > 0x7E && cp < 0xA0) ||\n\t\t\t\t\t(cp > 0xD7FF && cp < 0xE000) ||\n\t\t\t\t\t(cp > 0xFFFD && cp < 0x10000) ||\n\t\t\t\t\t(cp > 0x10FFFF)\n\t\t\t\t) {\n\t\t\t\t\t// Invalid entity!\n\t\t\t\t\treturn 
match;\n\t\t\t\t}\n\t\t\t\treturn c;\n\t\t\t}\n\t\t);\n\t},\n\n\t/**\n\t * Entity-escape anything that would decode to a valid wikitext entity.\n\t *\n\t * Note that HTML5 allows certain \"semicolon-less\" entities, like\n\t * `&para`; these aren't allowed in wikitext and won't be escaped\n\t * by this function.\n\t *\n\t * @param {string} text\n\t * @return {string}\n\t */\n\tescapeWtEntities: function(text) {\n\t\t// [CSA] replace with entities.encode( text, 2 )?\n\t\t// but that would encode *all* ampersands, where we apparently just want\n\t\t// to encode ampersands that precede valid entities.\n\t\treturn text.replace(/&[#0-9a-zA-Z]+;/g, function(match) {\n\t\t\tvar decodedChar = Util.decodeWtEntities(match);\n\t\t\tif (decodedChar !== match) {\n\t\t\t\t// Escape the ampersand\n\t\t\t\treturn '&amp;' + match.substr(1);\n\t\t\t} else {\n\t\t\t\t// Not an entity, just return the string\n\t\t\t\treturn match;\n\t\t\t}\n\t\t});\n\t},\n\n\tescapeHtml: function(s) {\n\t\treturn s.replace(/[\"'&<>]/g, entities.encodeHTML5);\n\t},\n\n\t/**\n\t * Encode all characters as entity references.  This is done to make\n\t * characters safe for wikitext (regardless of whether they are\n\t * HTML-safe).\n\t *\n\t * @param {string} s\n\t * @return {string}\n\t */\n\tentityEncodeAll: function(s) {\n\t\t// this is surrogate-aware\n\t\treturn Array.from(s).map(function(c) {\n\t\t\tc = c.codePointAt(0).toString(16).toUpperCase();\n\t\t\tif (c.length === 1) { c = '0' + c; } // convention\n\t\t\tif (c === 'A0') { return '&nbsp;'; } // special-case common usage\n\t\t\treturn '&#x' + c + ';';\n\t\t}).join('');\n\t},\n\n\t/**\n\t * Determine whether the protocol of a link is potentially valid. 
Use the\n\t * environment's per-wiki config to do so.\n\t *\n\t * @param linkTarget\n\t * @param env\n\t */\n\tisProtocolValid: function(linkTarget, env) {\n\t\tvar wikiConf = env.conf.wiki;\n\t\tif (typeof linkTarget === 'string') {\n\t\t\treturn wikiConf.hasValidProtocol(linkTarget);\n\t\t} else {\n\t\t\treturn true;\n\t\t}\n\t},\n\n\tparseMediaDimensions: function(str, onlyOne) {\n\t\tvar dimensions = null;\n\t\tvar match = str.match(/^(\\d*)(?:x(\\d+))?\\s*(?:px\\s*)?$/);\n\t\tif (match) {\n\t\t\tdimensions = { x: undefined, y: undefined };\n\t\t\tif (match[1].length) {\n\t\t\t\tdimensions.x = Number(match[1]);\n\t\t\t}\n\t\t\tif (match[2] !== undefined) {\n\t\t\t\tif (onlyOne) { return null; }\n\t\t\t\tdimensions.y = Number(match[2]);\n\t\t\t}\n\t\t}\n\t\treturn dimensions;\n\t},\n\n\t// More generally, this is defined by the media handler in core\n\tvalidateMediaParam: function(num) {\n\t\treturn num !== null && num !== undefined && num > 0;\n\t},\n\n\t// Extract content in a backwards compatible way\n\tgetStar: function(revision) {\n\t\tvar content = revision;\n\t\tif (revision && revision.slots) {\n\t\t\tcontent = revision.slots.main;\n\t\t}\n\t\treturn content;\n\t},\n\n\t/**\n\t * Magic words masquerading as templates.\n\t *\n\t * @property {Set}\n\t */\n\tmagicMasqs: new Set([\"defaultsort\", \"displaytitle\"]),\n\n\t/**\n\t * This regex was generated by running through *all unicode characters* and\n\t * testing them against *all regexes* for linktrails in a default MW install.\n\t * We had to treat it a little bit, here's what we changed:\n\t *\n\t * 1. A-Z, though allowed in Walloon, is disallowed.\n\t * 2. '\"', though allowed in Chuvash, is disallowed.\n\t * 3. '-', though allowed in Icelandic (possibly due to a bug), is disallowed.\n\t * 4. 
'1', though allowed in Lak (possibly due to a bug), is disallowed.\n\t *\n\t * @property {RegExp}\n\t */\n\tlinkTrailRegex: new RegExp(\n\t\t'^[^\\0-`{÷ĀĈ-ČĎĐĒĔĖĚĜĝĠ-ĪĬ-įIJĴ-ĹĻ-ĽĿŀŅņʼnŊŌŎŏŒŔŖ-ŘŜŝŠŤŦŨŪ-ŬŮŲ-ŴŶŸ' +\n\t\t'ſ-ǤǦǨǪ-Ǯǰ-ȗȜ-ȞȠ-ɘɚ-ʑʓ-ʸʽ-̂̄-΅·΋΍΢Ϗ-ЯѐѝѠѢѤѦѨѪѬѮѰѲѴѶѸѺ-ѾҀ-҃҅-ҐҒҔҕҘҚҜ-ҠҤ-ҪҬҭҰҲ' +\n\t\t'Ҵ-ҶҸҹҼ-ҿӁ-ӗӚ-ӜӞӠ-ӢӤӦӪ-ӲӴӶ-ՠֈ-׏׫-ؠً-ٳٵ-ٽٿ-څڇ-ڗڙ-ڨڪ-ڬڮڰ-ڽڿ-ۅۈ-ۊۍ-۔ۖ-਀਄਋-਎਑਒' +\n\t\t'਩਱਴਷਺਻਽੃-੆੉੊੎-੘੝੟-੯ੴ-჏ჱ-ẼẾ-\\u200b\\u200d-‒—-‗‚‛”--\\ufffd]+$'),\n\n\t/**\n\t * Check whether some text is a valid link trail.\n\t *\n\t * @param {string} text\n\t * @return {boolean}\n\t */\n\tisLinkTrail: function(text) {\n\t\tif (text && text.match && text.match(this.linkTrailRegex)) {\n\t\t\treturn true;\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t},\n\n\t/**\n\t * Convert mediawiki-format language code to a BCP47-compliant language\n\t * code suitable for including in HTML.  See\n\t * `GlobalFunctions.php::wfBCP47()` in mediawiki sources.\n\t *\n\t * @param {string} code Mediawiki language code.\n\t * @return {string} BCP47 language code.\n\t */\n\tbcp47: function(code) {\n\t\tvar codeSegment = code.split('-');\n\t\tvar codeBCP = [];\n\t\tcodeSegment.forEach(function(seg, segNo) {\n\t\t\t// When previous segment is x, it is a private segment and should be lc\n\t\t\tif (segNo > 0 && /^x$/i.test(codeSegment[segNo - 1])) {\n\t\t\t\tcodeBCP[segNo] = seg.toLowerCase();\n\t\t\t// ISO 3166 country code\n\t\t\t} else if (seg.length === 2 && segNo > 0) {\n\t\t\t\tcodeBCP[segNo] = seg.toUpperCase();\n\t\t\t// ISO 15924 script code\n\t\t\t} else if (seg.length === 4 && segNo > 0) {\n\t\t\t\tcodeBCP[segNo] = seg[0].toUpperCase() + seg.slice(1).toLowerCase();\n\t\t\t// Use lowercase for other cases\n\t\t\t} else {\n\t\t\t\tcodeBCP[segNo] = seg.toLowerCase();\n\t\t\t}\n\t\t});\n\t\treturn codeBCP.join('-');\n\t},\n};\n\nif (typeof module === \"object\") {\n\tmodule.exports.Util = 
Util;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/WTUtils.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":28,"column":2,"nodeType":"Block","endLine":35,"endColumn":5},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":40,"column":2,"nodeType":"Block","endLine":44,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":43,"column":null,"nodeType":"Block","endLine":43,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":56,"column":2,"nodeType":"Block","endLine":62,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":61,"column":null,"nodeType":"Block","endLine":61,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":67,"column":2,"nodeType":"Block","endLine":76,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"aNode\" type.","line":72,"column":null,"nodeType":"Block","endLine":72,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"dp\" type.","line":73,"column":null,"nodeType":"Block","endLine":73,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"aNode\" type.","line":74,"column":null,"nodeType":"Block","endLine":74,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"aNode\"","line":74,"column":null,"nodeType":"Block","endLine":74,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"dp\" 
type.","line":75,"column":null,"nodeType":"Block","endLine":75,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":124,"column":null,"nodeType":"Block","endLine":124,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":135,"column":null,"nodeType":"Block","endLine":135,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'bool' is undefined.","line":136,"column":null,"nodeType":"Block","endLine":136,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":142,"column":2,"nodeType":"Block","endLine":146,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":145,"column":null,"nodeType":"Block","endLine":145,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":151,"column":2,"nodeType":"Block","endLine":155,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":154,"column":null,"nodeType":"Block","endLine":154,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":161,"column":2,"nodeType":"Block","endLine":166,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":165,"column":null,"nodeType":"Block","endLine":165,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":172,"column":2,"nodeType":"Block","endLine":176,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" 
type.","line":175,"column":null,"nodeType":"Block","endLine":175,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":190,"column":2,"nodeType":"Block","endLine":199,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":198,"column":null,"nodeType":"Block","endLine":198,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":212,"column":2,"nodeType":"Block","endLine":216,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":215,"column":null,"nodeType":"Block","endLine":215,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'TextNode' is undefined.","line":233,"column":null,"nodeType":"Block","endLine":233,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":264,"column":null,"nodeType":"Block","endLine":264,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":293,"column":2,"nodeType":"Block","endLine":303,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":302,"column":null,"nodeType":"Block","endLine":302,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":323,"column":2,"nodeType":"Block","endLine":331,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":330,"column":null,"nodeType":"Block","endLine":330,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":354,"column":null,"nodeType":"Block","endLine":354,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing 
JSDoc @return declaration.","line":374,"column":2,"nodeType":"Block","endLine":378,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":377,"column":null,"nodeType":"Block","endLine":377,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":383,"column":2,"nodeType":"Block","endLine":390,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":389,"column":null,"nodeType":"Block","endLine":389,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":419,"column":null,"nodeType":"Block","endLine":419,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":434,"column":2,"nodeType":"Block","endLine":441,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Frame' is undefined.","line":439,"column":null,"nodeType":"Block","endLine":439,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":440,"column":null,"nodeType":"Block","endLine":440,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":449,"column":2,"nodeType":"Block","endLine":465,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":463,"column":null,"nodeType":"Block","endLine":463,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"about\" type.","line":464,"column":null,"nodeType":"Block","endLine":464,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return 
declaration.","line":490,"column":2,"nodeType":"Block","endLine":500,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"node\" type.","line":499,"column":null,"nodeType":"Block","endLine":499,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":603,"column":null,"nodeType":"Block","endLine":603,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":623,"column":2,"nodeType":"Block","endLine":631,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"typeOf\" type.","line":628,"column":null,"nodeType":"Block","endLine":628,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"attrs\" type.","line":629,"column":null,"nodeType":"Block","endLine":629,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"encode\" type.","line":630,"column":null,"nodeType":"Block","endLine":630,"endColumn":null}],"errorCount":0,"warningCount":51,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * These utilites pertain to extracting / modifying wikitext information from the DOM.\n *\n * @module\n */\n\n'use strict';\n\nconst Consts = require('../config/WikitextConstants.js').WikitextConstants;\nconst { DOMDataUtils } = require('./DOMDataUtils.js');\nconst { DOMUtils } = require('./DOMUtils.js');\nconst { JSUtils } = require('./jsutils.js');\nconst { TokenUtils } = require('./TokenUtils.js');\nconst { Util } = require('./Util.js');\n\nconst lastItem = JSUtils.lastItem;\n\n/**\n * Regexp for checking marker metas typeofs representing\n * transclusion markup or template param markup.\n *\n * @property {RegExp}\n */\nconst TPL_META_TYPE_REGEXP = /^mw:(?:Transclusion|Param)(?:\\/End)?$/;\n\nclass WTUtils {\n\n\t/**\n\t * Check whether a node's data-parsoid object 
includes\n\t * an indicator that the original wikitext was a literal\n\t * HTML element (like table or p).\n\t *\n\t * @param {Object} dp\n\t *   @param {string|undefined} [dp.stx]\n\t */\n\tstatic hasLiteralHTMLMarker(dp) {\n\t\treturn dp.stx === 'html';\n\t}\n\n\t/**\n\t * Run a node through {@link #hasLiteralHTMLMarker}.\n\t *\n\t * @param node\n\t */\n\tstatic isLiteralHTMLNode(node) {\n\t\treturn (node &&\n\t\t\tDOMUtils.isElt(node) &&\n\t\t\tthis.hasLiteralHTMLMarker(DOMDataUtils.getDataParsoid(node)));\n\t}\n\n\tstatic isZeroWidthWikitextElt(node) {\n\t\treturn Consts.ZeroWidthWikitextTags.has(node.nodeName) &&\n\t\t\t!this.isLiteralHTMLNode(node);\n\t}\n\n\t/**\n\t * Is `node` a block node that is also visible in wikitext?\n\t * An example of an invisible block node is a `<p>`-tag that\n\t * Parsoid generated, or a `<ul>`, `<ol>` tag.\n\t *\n\t * @param {Node} node\n\t */\n\tstatic isBlockNodeWithVisibleWT(node) {\n\t\treturn DOMUtils.isBlockNode(node) && !this.isZeroWidthWikitextElt(node);\n\t}\n\n\t/**\n\t * Helper functions to detect when an A-node uses [[..]]/[..]/... style\n\t * syntax (for wikilinks, ext links, url links). 
rel-type is not sufficient\n\t * anymore since mw:ExtLink is used for all the three link syntaxes.\n\t *\n\t * @param aNode\n\t * @param dp\n\t * @param aNode\n\t * @param dp\n\t */\n\tstatic usesWikiLinkSyntax(aNode, dp) {\n\t\tif (dp === undefined) {\n\t\t\tdp = DOMDataUtils.getDataParsoid(aNode);\n\t\t}\n\n\t\t// SSS FIXME: This requires to be made more robust\n\t\t// for when dp.stx value is not present\n\t\treturn aNode.getAttribute(\"rel\") === \"mw:WikiLink\" ||\n\t\t\t(dp.stx && dp.stx !== \"url\" && dp.stx !== \"magiclink\");\n\t}\n\n\tstatic usesExtLinkSyntax(aNode, dp) {\n\t\tif (dp === undefined) {\n\t\t\tdp = DOMDataUtils.getDataParsoid(aNode);\n\t\t}\n\n\t\t// SSS FIXME: This requires to be made more robust\n\t\t// for when dp.stx value is not present\n\t\treturn aNode.getAttribute(\"rel\") === \"mw:ExtLink\" &&\n\t\t\t(!dp.stx || (dp.stx !== \"url\" && dp.stx !== \"magiclink\"));\n\t}\n\n\tstatic usesURLLinkSyntax(aNode, dp) {\n\t\tif (dp === undefined) {\n\t\t\tdp = DOMDataUtils.getDataParsoid(aNode);\n\t\t}\n\n\t\t// SSS FIXME: This requires to be made more robust\n\t\t// for when dp.stx value is not present\n\t\treturn aNode.getAttribute(\"rel\") === \"mw:ExtLink\" &&\n\t\t\tdp.stx && dp.stx === \"url\";\n\t}\n\n\tstatic usesMagicLinkSyntax(aNode, dp) {\n\t\tif (dp === undefined) {\n\t\t\tdp = DOMDataUtils.getDataParsoid(aNode);\n\t\t}\n\n\t\t// SSS FIXME: This requires to be made more robust\n\t\t// for when dp.stx value is not present\n\t\treturn aNode.getAttribute(\"rel\") === \"mw:ExtLink\" &&\n\t\t\tdp.stx && dp.stx === \"magiclink\";\n\t}\n\n\t/**\n\t * Check whether a node's typeof indicates that it is a template expansion.\n\t *\n\t * @param {Node} node\n\t * @return {string|null} The matched type, or null if no match.\n\t */\n\tstatic matchTplType(node) {\n\t\treturn DOMUtils.matchTypeOf(node, TPL_META_TYPE_REGEXP);\n\t}\n\n\t/**\n\t * Check whether a typeof indicates that it signifies an\n\t * expanded attribute.\n\t *\n\t * @param 
node\n\t * @return {bool}\n\t */\n\tstatic hasExpandedAttrsType(node) {\n\t\treturn DOMUtils.matchTypeOf(node, /^mw:ExpandedAttrs(\\/[^\\s]+)*$/) !== null;\n\t}\n\n\t/**\n\t * Check whether a node is a meta tag that signifies a template expansion.\n\t *\n\t * @param node\n\t */\n\tstatic isTplMarkerMeta(node) {\n\t\treturn DOMUtils.matchNameAndTypeOf(node, 'META', TPL_META_TYPE_REGEXP) !== null;\n\t}\n\n\t/**\n\t * Check whether a node is a meta signifying the start of a template expansion.\n\t *\n\t * @param node\n\t */\n\tstatic isTplStartMarkerMeta(node) {\n\t\tvar t = DOMUtils.matchNameAndTypeOf(node, 'META', TPL_META_TYPE_REGEXP);\n\t\treturn t && !/\\/End$/.test(t);\n\t}\n\n\t/**\n\t * Check whether a node is a meta signifying the end of a template\n\t * expansion.\n\t *\n\t * @param {Node} n\n\t */\n\tstatic isTplEndMarkerMeta(n) {\n\t\tvar t = DOMUtils.matchNameAndTypeOf(n, 'META', TPL_META_TYPE_REGEXP);\n\t\treturn t && /\\/End$/.test(t);\n\t}\n\n\t/**\n\t * Find the first wrapper element of encapsulated content.\n\t *\n\t * @param node\n\t */\n\tstatic findFirstEncapsulationWrapperNode(node) {\n\t\tif (!this.hasParsoidAboutId(node)) {\n\t\t\treturn null;\n\t\t}\n\t\tvar about = node.getAttribute('about') || '';\n\t\tvar prev = node;\n\t\tdo {\n\t\t\tnode = prev;\n\t\t\tprev = DOMUtils.previousNonDeletedSibling(node);\n\t\t} while (prev && DOMUtils.isElt(prev) && prev.getAttribute('about') === about);\n\t\treturn this.isFirstEncapsulationWrapperNode(node) ? node : null;\n\t}\n\n\t/**\n\t * This tests whether a DOM node is a new node added during an edit session\n\t * or an existing node from parsed wikitext.\n\t *\n\t * As written, this function can only be used on non-template/extension content\n\t * or on the top-level nodes of template/extension content. 
This test will\n\t * return the wrong results on non-top-level nodes of template/extension content.\n\t *\n\t * @param {Node} node\n\t */\n\tstatic isNewElt(node) {\n\t\t// We cannot determine newness on text/comment nodes.\n\t\tif (!DOMUtils.isElt(node)) {\n\t\t\treturn false;\n\t\t}\n\n\t\t// For template/extension content, newness should be\n\t\t// checked on the encapsulation wrapper node.\n\t\tnode = this.findFirstEncapsulationWrapperNode(node) || node;\n\t\treturn !!DOMDataUtils.getDataParsoid(node).tmp.isNew;\n\t}\n\n\t/**\n\t * Check whether a pre is caused by indentation in the original wikitext.\n\t *\n\t * @param node\n\t */\n\tstatic isIndentPre(node) {\n\t\treturn node.nodeName === \"PRE\" && !this.isLiteralHTMLNode(node);\n\t}\n\n\tstatic isInlineMedia(n) {\n\t\treturn DOMUtils.matchNameAndTypeOf(n, 'FIGURE-INLINE', /^mw:(?:Image|Video|Audio)($|\\/)/) !== null;\n\t}\n\n\tstatic isGeneratedFigure(n) {\n\t\treturn DOMUtils.matchTypeOf(n, /^mw:(?:Image|Video|Audio)($|\\/)/) !== null;\n\t}\n\n\t/**\n\t * Find how much offset is necessary for the DSR of an\n\t * indent-originated pre tag.\n\t *\n\t * @param {TextNode} textNode\n\t * @return {number}\n\t */\n\tstatic indentPreDSRCorrection(textNode) {\n\t\t// NOTE: This assumes a text-node and doesn't check that it is one.\n\t\t//\n\t\t// FIXME: Doesn't handle text nodes that are not direct children of the pre\n\t\tif (this.isIndentPre(textNode.parentNode)) {\n\t\t\tvar numNLs;\n\t\t\tif (textNode.parentNode.lastChild === textNode) {\n\t\t\t\t// We don't want the trailing newline of the last child of the pre\n\t\t\t\t// to contribute a pre-correction since it doesn't add new content\n\t\t\t\t// in the pre-node after the text\n\t\t\t\tnumNLs = (textNode.nodeValue.match(/\\n./g) || []).length;\n\t\t\t} else {\n\t\t\t\tnumNLs = (textNode.nodeValue.match(/\\n/g) || []).length;\n\t\t\t}\n\t\t\treturn numNLs;\n\t\t} else {\n\t\t\treturn 0;\n\t\t}\n\t}\n\n\t/**\n\t * Check if node is an ELEMENT node that belongs to a 
template/extension.\n\t *\n\t * NOTE: Use with caution. This technique works reliably for the\n\t * root level elements of tpl-content DOM subtrees since only they\n\t * are guaranteed to be  marked and nested content might not\n\t * necessarily be marked.\n\t *\n\t * @param {Node} node\n\t * @return {boolean}\n\t */\n\tstatic hasParsoidAboutId(node) {\n\t\tif (DOMUtils.isElt(node)) {\n\t\t\tvar about = node.getAttribute('about') || '';\n\t\t\t// SSS FIXME: Verify that our DOM spec clarifies this\n\t\t\t// expectation on about-ids and that our clients respect this.\n\t\t\treturn about && Util.isParsoidObjectId(about);\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t}\n\n\tstatic isRedirectLink(node) {\n\t\treturn DOMUtils.isElt(node) && node.nodeName === 'LINK' &&\n\t\t\t/\\bmw:PageProp\\/redirect\\b/.test(node.getAttribute('rel') || '');\n\t}\n\n\tstatic isCategoryLink(node) {\n\t\treturn DOMUtils.isElt(node) && node.nodeName === 'LINK' &&\n\t\t\t/\\bmw:PageProp\\/Category\\b/.test(node.getAttribute('rel') || '');\n\t}\n\n\tstatic isSolTransparentLink(node) {\n\t\treturn DOMUtils.isElt(node) && node.nodeName === 'LINK' &&\n\t\t\tTokenUtils.solTransparentLinkRegexp.test(node.getAttribute('rel') || '');\n\t}\n\n\t/**\n\t * Check if 'node' emits wikitext that is sol-transparent in wikitext form.\n\t * This is a test for wikitext that doesn't introduce line breaks.\n\t *\n\t * Comment, whitespace text nodes, category links, redirect links, behavior\n\t * switches, and include directives currently satisfy this definition.\n\t *\n\t * This should come close to matching TokenUtils.isSolTransparent()\n\t *\n\t * @param {Node} node\n\t */\n\tstatic emitsSolTransparentSingleLineWT(node) {\n\t\tif (DOMUtils.isText(node)) {\n\t\t\t// NB: We differ here to meet the nl condition.\n\t\t\treturn node.nodeValue.match(/^[ \\t]*$/);\n\t\t} else if (this.isRenderingTransparentNode(node)) {\n\t\t\t// NB: The only metas in a DOM should be for behavior switches and\n\t\t\t// include 
directives, other than explicit HTML meta tags. This\n\t\t\t// differs from our counterpart in Util where ref meta tokens\n\t\t\t// haven't been expanded to spans yet.\n\t\t\treturn true;\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t}\n\n\tstatic isFallbackIdSpan(node) {\n\t\treturn DOMUtils.hasNameAndTypeOf(node, 'SPAN', 'mw:FallbackId');\n\t}\n\n\t/**\n\t * These are primarily 'metadata'-like nodes that don't show up in output rendering.\n\t * - In Parsoid output, they are represented by link/meta tags.\n\t * - In the PHP parser, they are completely stripped from the input early on.\n\t * Because of this property, these rendering-transparent nodes are also\n\t * SOL-transparent for the purposes of parsing behavior.\n\t *\n\t * @param node\n\t */\n\tstatic isRenderingTransparentNode(node) {\n\t\t// FIXME: Can we change this entire thing to\n\t\t// DOMUtils.isComment(node) ||\n\t\t// DOMUtils.getDataParsoid(node).stx !== 'html' &&\n\t\t//   (node.nodeName === 'META' || node.nodeName === 'LINK')\n\t\t//\n\t\treturn DOMUtils.isComment(node) ||\n\t\t\tthis.isSolTransparentLink(node) ||\n\t\t\t// Catch-all for everything else.\n\t\t\t(node.nodeName === 'META' &&\n\t\t\t\t// (Start|End)Tag metas clone data-parsoid from the tokens\n\t\t\t\t// they're shadowing, which trips up on the stx check.\n\t\t\t\t// TODO: Maybe that data should be nested in a property?\n\t\t\t\t(DOMUtils.matchTypeOf(node, /^mw:(StartTag|EndTag)$/) !== null ||\n\t\t\t\tDOMDataUtils.getDataParsoid(node).stx !== 'html')) ||\n\t\t\tthis.isFallbackIdSpan(node);\n\t}\n\n\t/**\n\t * Is node nested inside a table tag that uses HTML instead of native\n\t * wikitext?\n\t *\n\t * @param {Node} node\n\t * @return {boolean}\n\t */\n\tstatic inHTMLTableTag(node) {\n\t\tvar p = node.parentNode;\n\t\twhile (DOMUtils.isTableTag(p)) {\n\t\t\tif (this.isLiteralHTMLNode(p)) {\n\t\t\t\treturn true;\n\t\t\t} else if (p.nodeName === 'TABLE') {\n\t\t\t\t// Don't cross <table> boundaries\n\t\t\t\treturn 
false;\n\t\t\t}\n\t\t\tp = p.parentNode;\n\t\t}\n\n\t\treturn false;\n\t}\n\n\tstatic FIRST_ENCAP_REGEXP() { return /(?:^|\\s)(mw:(?:Transclusion|Param|LanguageVariant|Extension(\\/[^\\s]+)))(?=$|\\s)/; }\n\n\t/**\n\t * Is node the first wrapper element of encapsulated content?\n\t *\n\t * @param node\n\t */\n\tstatic isFirstEncapsulationWrapperNode(node) {\n\t\treturn DOMUtils.matchTypeOf(node, this.FIRST_ENCAP_REGEXP()) !== null;\n\t}\n\n\t/**\n\t * Is node an encapsulation wrapper elt?\n\t *\n\t * All root-level nodes of generated content are considered\n\t * encapsulation wrappers and share an about-id.\n\t *\n\t * @param node\n\t */\n\tstatic isEncapsulationWrapper(node) {\n\t\t// True if it has an encapsulation type or while walking backwards\n\t\t// over elts with identical about ids, we run into a node with an\n\t\t// encapsulation type.\n\t\tif (!DOMUtils.isElt(node)) {\n\t\t\treturn false;\n\t\t}\n\n\t\treturn this.findFirstEncapsulationWrapperNode(node) !== null;\n\t}\n\n\tstatic isDOMFragmentWrapper(node) {\n\t\treturn DOMUtils.isElt(node) &&\n\t\t\tTokenUtils.isDOMFragmentType(node.getAttribute('typeof') || '');\n\t}\n\n\tstatic isSealedFragmentOfType(node, type) {\n\t\treturn DOMUtils.hasTypeOf(node, 'mw:DOMFragment/sealed/' + type);\n\t}\n\n\tstatic isParsoidSectionTag(node) {\n\t\treturn node.nodeName === 'SECTION' &&\n\t\t\tnode.hasAttribute('data-mw-section-id');\n\t}\n\n\t/**\n\t * Is the node from extension content?\n\t *\n\t * @param {Node} node\n\t * @param {string} extType\n\t * @return {boolean}\n\t */\n\tstatic fromExtensionContent(node, extType) {\n\t\tvar parentNode = node.parentNode;\n\t\twhile (parentNode && !DOMUtils.atTheTop(parentNode)) {\n\t\t\tif (DOMUtils.hasTypeOf(parentNode, 'mw:Extension/' + extType)) {\n\t\t\t\treturn true;\n\t\t\t}\n\t\t\tparentNode = parentNode.parentNode;\n\t\t}\n\t\treturn false;\n\t}\n\n\t/**\n\t * Compute, when possible, the wikitext source for a node in\n\t * a frame f. 
Returns null if the source cannot be\n\t * extracted.\n\t *\n\t * @param {Frame} frame\n\t * @param {Node} node\n\t */\n\tstatic getWTSource(frame, node) {\n\t\tvar data = DOMDataUtils.getDataParsoid(node);\n\t\tvar dsr = (undefined !== data) ? data.dsr : null;\n\t\treturn dsr && Util.isValidDSR(dsr) ?\n\t\t\tframe.srcText.substring(dsr[0], dsr[1]) : null;\n\t}\n\n\t/**\n\t * Gets all siblings that follow 'node' that have an 'about' as\n\t * their about id.\n\t *\n\t * This is used to fetch transclusion/extension content by using\n\t * the about-id as the key.  This works because\n\t * transclusion/extension content is a forest of dom-trees formed\n\t * by adjacent dom-nodes.  This is the contract that template\n\t * encapsulation, dom-reuse, and VE code all have to abide by.\n\t *\n\t * The only exception to this adjacency rule is IEW nodes in\n\t * fosterable positions (in tables) which are not span-wrapped to\n\t * prevent them from getting fostered out.\n\t *\n\t * @param node\n\t * @param about\n\t */\n\tstatic getAboutSiblings(node, about) {\n\t\tvar nodes = [node];\n\n\t\tif (!about) {\n\t\t\treturn nodes;\n\t\t}\n\n\t\tnode = node.nextSibling;\n\t\twhile (node && (\n\t\t\tDOMUtils.isElt(node) && node.getAttribute('about') === about ||\n\t\t\t\tDOMUtils.isFosterablePosition(node) && !DOMUtils.isElt(node) && DOMUtils.isIEW(node)\n\t\t)) {\n\t\t\tnodes.push(node);\n\t\t\tnode = node.nextSibling;\n\t\t}\n\n\t\t// Remove already consumed trailing IEW, if any\n\t\twhile (nodes.length && DOMUtils.isIEW(lastItem(nodes))) {\n\t\t\tnodes.pop();\n\t\t}\n\n\t\treturn nodes;\n\t}\n\n\t/**\n\t * This function is only intended to be used on encapsulated nodes\n\t * (Template/Extension/Param content).\n\t *\n\t * Given a 'node' that has an about-id, it is assumed that it is generated\n\t * by templates or extensions.  
This function skips over all\n\t * following content nodes and returns the first non-template node\n\t * that follows it.\n\t *\n\t * @param node\n\t */\n\tstatic skipOverEncapsulatedContent(node) {\n\t\tif (node.hasAttribute('about')) {\n\t\t\tvar about = node.getAttribute('about');\n\t\t\treturn lastItem(this.getAboutSiblings(node, about)).nextSibling;\n\t\t} else {\n\t\t\treturn node.nextSibling;\n\t\t}\n\t}\n\n\t// Comment encoding/decoding.\n\t//\n\t//  * Some relevant phab tickets: T94055, T70146, T60184, T95039\n\t//\n\t// The wikitext comment rule is very simple: <!-- starts a comment,\n\t// and --> ends a comment.  This means we can have almost anything as the\n\t// contents of a comment (except the string \"-->\", but see below), including\n\t// several things that are not valid in HTML5 comments:\n\t//\n\t//  * For one, the html5 comment parsing algorithm [0] leniently accepts\n\t//    --!> as a closing comment tag, which differs from the php+tidy combo.\n\t//\n\t//  * If the comment's data matches /^-?>/, html5 will end the comment.\n\t//    For example, <!-->stuff<--> breaks up as\n\t//    <!--> (the comment) followed by, stuff<--> (as text).\n\t//\n\t//  * Finally, comment data shouldn't contain two consecutive hyphen-minus\n\t//    characters (--), nor end in a hyphen-minus character (/-$/) as defined\n\t//    in the spec [1].\n\t//\n\t// We work around all these problems by using HTML entity encoding inside\n\t// the comment body.  The characters -, >, and & must be encoded in order\n\t// to prevent premature termination of the comment by one of the cases\n\t// above.  Encoding other characters is optional; all entities will be\n\t// decoded during wikitext serialization.\n\t//\n\t// In order to allow *arbitrary* content inside a wikitext comment,\n\t// including the forbidden string \"-->\" we also do some minimal entity\n\t// decoding on the wikitext.  
We are also limited by our inability\n\t// to encode DSR attributes on the comment node, so our wikitext entity\n\t// decoding must be 1-to-1: that is, there must be a unique \"decoded\"\n\t// string for every wikitext sequence, and for every decoded string there\n\t// must be a unique wikitext which creates it.\n\t//\n\t// The basic idea here is to replace every string ab*c with the string with\n\t// one more b in it.  This creates a string with no instance of \"ac\",\n\t// so you can use 'ac' to encode one more code point.  In this case\n\t// a is \"--&\", \"b\" is \"amp;\", and \"c\" is \"gt;\" and we use ac to\n\t// encode \"-->\" (which is otherwise unspeakable in wikitext).\n\t//\n\t// Note that any user content which does not match the regular\n\t// expression /--(>|&(amp;)*gt;)/ is unchanged in its wikitext\n\t// representation, as shown in the first two examples below.\n\t//\n\t// User-authored comment text    Wikitext       HTML5 DOM\n\t// --------------------------    -------------  ----------------------\n\t// & - >                         & - >          &amp; &#45; &gt;\n\t// Use &gt; here                 Use &gt; here  Use &amp;gt; here\n\t// -->                           --&gt;         &#45;&#45;&gt;\n\t// --&gt;                        --&amp;gt;     &#45;&#45;&amp;gt;\n\t// --&amp;gt;                    --&amp;amp;gt; &#45;&#45;&amp;amp;gt;\n\t//\n\t// [0] http://www.w3.org/TR/html5/syntax.html#comment-start-state\n\t// [1] http://www.w3.org/TR/html5/syntax.html#comments\n\n\t/**\n\t * Map a wikitext-escaped comment to an HTML DOM-escaped comment.\n\t *\n\t * @param {string} comment Wikitext-escaped comment.\n\t * @return {string} DOM-escaped comment.\n\t */\n\tstatic encodeComment(comment) {\n\t\t// Undo wikitext escaping to obtain \"true value\" of comment.\n\t\tvar trueValue = comment\n\t\t\t.replace(/--&(amp;)*gt;/g, Util.decodeWtEntities);\n\t\t// Now encode '-', '>' and '&' in the \"true value\" as HTML entities,\n\t\t// so that they can be 
safely embedded in an HTML comment.\n\t\t// This part doesn't have to map strings 1-to-1.\n\t\treturn trueValue\n\t\t\t.replace(/[->&]/g, Util.entityEncodeAll);\n\t}\n\n\t/**\n\t * Map an HTML DOM-escaped comment to a wikitext-escaped comment.\n\t *\n\t * @param {string} comment DOM-escaped comment.\n\t * @return {string} Wikitext-escaped comment.\n\t */\n\tstatic decodeComment(comment) {\n\t\t// Undo HTML entity escaping to obtain \"true value\" of comment.\n\t\tvar trueValue = Util.decodeWtEntities(comment);\n\t\t// ok, now encode this \"true value\" of the comment in such a way\n\t\t// that the string \"-->\" never shows up.  (See above.)\n\t\treturn trueValue\n\t\t\t.replace(/--(&(amp;)*gt;|>)/g, function(s) {\n\t\t\t\treturn s === '-->' ? '--&gt;' : '--&amp;' + s.slice(3);\n\t\t\t});\n\t}\n\n\t/**\n\t * Utility function: we often need to know the wikitext DSR length for\n\t * an HTML DOM comment value.\n\t *\n\t * @param {Node} node A comment node containing a DOM-escaped comment.\n\t * @return {number} The wikitext length necessary to encode this comment,\n\t *   including 7 characters for the `<!--` and `-->` delimiters.\n\t */\n\tstatic decodedCommentLength(node) {\n\t\tconsole.assert(DOMUtils.isComment(node));\n\t\t// Add 7 for the \"<!--\" and \"-->\" delimiters in wikitext.\n\t\treturn this.decodeComment(node.data).length + 7;\n\t}\n\n\t/**\n\t * Escape `<nowiki>` tags.\n\t *\n\t * @param {string} text\n\t * @return {string}\n\t */\n\tstatic escapeNowikiTags(text) {\n\t\treturn text.replace(/<(\\/?nowiki\\s*\\/?\\s*)>/gi, '&lt;$1&gt;');\n\t}\n\n\t/**\n\t * Conditional encoding is because, while treebuilding, the value goes\n\t * directly from token to dom node without the comment itself being\n\t * stringified and parsed where the comment encoding would be necessary.\n\t *\n\t * @param typeOf\n\t * @param attrs\n\t * @param encode\n\t */\n\tstatic fosterCommentData(typeOf, attrs, encode) {\n\t\tlet str = JSON.stringify({\n\t\t\t'@type': 
typeOf,\n\t\t\tattrs,\n\t\t});\n\t\tif (encode) { str = WTUtils.encodeComment(str); }\n\t\treturn str;\n\t}\n\n\tstatic reinsertFosterableContent(env, node, decode) {\n\t\tif (DOMUtils.isComment(node) && /^\\{[^]+\\}$/.test(node.data)) {\n\t\t\t// Convert serialized meta tags back from comments.\n\t\t\t// We use this trick because comments won't be fostered,\n\t\t\t// providing more accurate information about where tags are expected\n\t\t\t// to be found.\n\t\t\tvar data, type;\n\t\t\ttry {\n\t\t\t\tdata = JSON.parse(decode ? WTUtils.decodeComment(node.data) : node.data);\n\t\t\t\ttype = data[\"@type\"];\n\t\t\t} catch (e) {\n\t\t\t\t// not a valid json attribute, do nothing\n\t\t\t\treturn null;\n\t\t\t}\n\t\t\tif (/^mw:/.test(type)) {\n\t\t\t\tvar meta = node.ownerDocument.createElement(\"meta\");\n\t\t\t\tdata.attrs.forEach(function(attr) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tmeta.setAttribute(...attr);\n\t\t\t\t\t} catch (e) {\n\t\t\t\t\t\tenv.log(\"warn\", \"prepareDOM: Dropped invalid attribute\", JSON.stringify(attr));\n\t\t\t\t\t}\n\t\t\t\t});\n\t\t\t\tnode.parentNode.replaceChild(meta, node);\n\t\t\t\treturn meta;\n\t\t\t}\n\t\t}\n\t\treturn null;\n\t}\n\n\tstatic getNativeExt(env, node) {\n\t\tconst prefixLen = \"mw:Extension/\".length;\n\t\tconst match = DOMUtils.matchTypeOf(node, /^mw:Extension\\/(.+?)$/);\n\t\treturn match && env.conf.wiki.extConfig.tags.get(match.slice(prefixLen));\n\t}\n}\n\nif (typeof module === \"object\") {\n\tmodule.exports.WTUtils = WTUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/jsutils.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":59,"column":2,"nodeType":"Block","endLine":68,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"it\" 
type.","line":66,"column":null,"nodeType":"Block","endLine":66,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"freezeEntries\" type.","line":67,"column":null,"nodeType":"Block","endLine":67,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":83,"column":2,"nodeType":"Block","endLine":91,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"it\" type.","line":86,"column":null,"nodeType":"Block","endLine":86,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"freezeEntries\" type.","line":87,"column":null,"nodeType":"Block","endLine":87,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"it\" type.","line":88,"column":null,"nodeType":"Block","endLine":88,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"it\"","line":88,"column":null,"nodeType":"Block","endLine":88,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"freezeEntries\" type.","line":89,"column":null,"nodeType":"Block","endLine":89,"endColumn":null},{"ruleId":"jsdoc/require-returns-check","severity":1,"message":"JSDoc @return declaration present but return expression not available in function.","line":141,"column":2,"nodeType":"Block","endLine":147,"endColumn":5},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":160,"column":2,"nodeType":"Block","endLine":165,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"obj\" type.","line":164,"column":null,"nodeType":"Block","endLine":164,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return 
declaration.","line":180,"column":2,"nodeType":"Block","endLine":186,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"n\" type.","line":185,"column":null,"nodeType":"Block","endLine":185,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":273,"column":2,"nodeType":"Block","endLine":281,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"accum\" type.","line":277,"column":null,"nodeType":"Block","endLine":277,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"arr\" type.","line":278,"column":null,"nodeType":"Block","endLine":278,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"accum\" type.","line":279,"column":null,"nodeType":"Block","endLine":279,"endColumn":null},{"ruleId":"jsdoc/check-param-names","severity":1,"message":"Duplicate @param \"accum\"","line":279,"column":null,"nodeType":"Block","endLine":279,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"arr\" type.","line":280,"column":null,"nodeType":"Block","endLine":280,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Found more than one @return declaration.","line":294,"column":2,"nodeType":"Block","endLine":372,"endColumn":5},{"ruleId":"jsdoc/require-returns-check","severity":1,"message":"Found more than one @return declaration.","line":294,"column":2,"nodeType":"Block","endLine":372,"endColumn":5},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'true' is undefined.","line":368,"column":null,"nodeType":"Block","endLine":368,"endColumn":null}],"errorCount":0,"warningCount":23,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * This file contains Parsoid-independent JS helper functions.\n * Over time, more functions can be migrated 
out of various other files here.\n *\n * @module\n */\n\n'use strict';\n\nrequire('../../core-upgrade.js');\n\nvar Promise = require('./promise.js');\n\nvar rejectMutation = function() {\n\tthrow new TypeError(\"Mutation attempted on read-only collection.\");\n};\n\nvar lastItem = function(array) {\n\tconsole.assert(Array.isArray(array));\n\treturn array[array.length - 1];\n};\n\n/** @namespace */\nvar JSUtils = {\n\n\t/**\n\t * Return the last item in an array.\n\t *\n\t * @method\n\t * @param {Array} array\n\t * @return {any} The last item in `array`\n\t */\n\tlastItem: lastItem,\n\n\t/**\n\t * Return a {@link Map} with the same initial keys and values as the\n\t * given {@link Object}.\n\t *\n\t * @param {Object} obj\n\t * @return {Map}\n\t */\n\tmapObject: function(obj) {\n\t\treturn new Map(Object.entries(obj));\n\t},\n\n\t/**\n\t * Return a two-way Map that maps each element to its index\n\t * (and vice-versa).\n\t *\n\t * @param {Array} arr\n\t * @return {Map}\n\t */\n\tarrayMap: function(arr) {\n\t\tvar m = new Map(arr.map(function(e, i) { return [e, i]; }));\n\t\tm.item = function(i) { return arr[i]; };\n\t\treturn m;\n\t},\n\n\t/**\n\t * ES6 maps/sets are still writable even when frozen, because they\n\t * store data inside the object linked from an internal slot.\n\t * This freezes a map by disabling the mutation methods, although\n\t * it's not bulletproof: you could use `Map.prototype.set.call(m, ...)`\n\t * to still mutate the backing store.\n\t *\n\t * @param it\n\t * @param freezeEntries\n\t */\n\tfreezeMap: function(it, freezeEntries) {\n\t\t// Allow `it` to be an iterable, as well as a map.\n\t\tif (!(it instanceof Map)) { it = new Map(it); }\n\t\tit.set = it.clear = it.delete = rejectMutation;\n\t\tObject.freeze(it);\n\t\tif (freezeEntries) {\n\t\t\tit.forEach(function(v, k) {\n\t\t\t\tJSUtils.deepFreeze(v);\n\t\t\t\tJSUtils.deepFreeze(k);\n\t\t\t});\n\t\t}\n\t\treturn it;\n\t},\n\n\t/**\n\t * This makes a set read-only.\n\t *\n\t * @param it\n\t 
* @param freezeEntries\n\t * @see {@link .freezeMap}\n\t */\n\tfreezeSet: function(it, freezeEntries) {\n\t\t// Allow `it` to be an iterable, as well as a set.\n\t\tif (!(it instanceof Set)) { it = new Set(it); }\n\t\tit.add = it.clear = it.delete = rejectMutation;\n\t\tObject.freeze(it);\n\t\tif (freezeEntries) {\n\t\t\tit.forEach(function(v) {\n\t\t\t\tJSUtils.deepFreeze(v);\n\t\t\t});\n\t\t}\n\t\treturn it;\n\t},\n\n\t/**\n\t * Deep-freeze an object.\n\t * {@link Map}s and {@link Set}s are handled with {@link .freezeMap} and\n\t * {@link .freezeSet}.\n\t *\n\t * @see https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Object/freeze\n\t * @param {any} o\n\t * @return {any} Frozen object\n\t */\n\tdeepFreeze: function(o) {\n\t\tif (!(o instanceof Object)) {\n\t\t\treturn o;\n\t\t} else if (Object.isFrozen(o)) {\n\t\t\t// Note that this might leave an unfrozen reference somewhere in\n\t\t\t// the object if there is an already frozen object containing an\n\t\t\t// unfrozen object.\n\t\t\treturn o;\n\t\t} else if (o instanceof Map) {\n\t\t\treturn JSUtils.freezeMap(o, true);\n\t\t} else if (o instanceof Set) {\n\t\t\treturn JSUtils.freezeSet(o, true);\n\t\t}\n\n\t\tObject.freeze(o);\n\t\tfor (var propKey in o) {\n\t\t\tvar desc = Object.getOwnPropertyDescriptor(o, propKey);\n\t\t\tif ((!desc) || desc.get || desc.set) {\n\t\t\t\t// If the object is on the prototype or is a getter, skip it.\n\t\t\t\tcontinue;\n\t\t\t}\n\t\t\t// Recursively call deepFreeze.\n\t\t\tJSUtils.deepFreeze(desc.value);\n\t\t}\n\t\treturn o;\n\t},\n\n\t/**\n\t * Deep freeze an object, except for the specified fields.\n\t *\n\t * @param {Object} o\n\t * @param {Object} ignoreFields\n\t * @return {Object} Frozen object.\n\t */\n\tdeepFreezeButIgnore: function(o, ignoreFields) {\n\t\tfor (var prop in o) {\n\t\t\tvar desc = Object.getOwnPropertyDescriptor(o, prop);\n\t\t\tif (ignoreFields[prop] === true || (!desc) || desc.get || 
desc.set) {\n\t\t\t\t// Ignore getters, primitives, and explicitly ignored fields.\n\t\t\t\tcontinue;\n\t\t\t}\n\t\t\to[prop] = JSUtils.deepFreeze(desc.value);\n\t\t}\n\t\tObject.freeze(o);\n\t},\n\n\t/**\n\t * Sort keys in an object, recursively, for better reproducibility.\n\t * (This is especially useful before serializing as JSON.)\n\t *\n\t * @param obj\n\t */\n\tsortObject: function(obj) {\n\t\tvar sortObject = JSUtils.sortObject;\n\t\tvar sortValue = function(v) {\n\t\t\tif (v instanceof Object) {\n\t\t\t\treturn Array.isArray(v) ? v.map(sortValue) : sortObject(v);\n\t\t\t}\n\t\t\treturn v;\n\t\t};\n\t\treturn Object.keys(obj).sort().reduce(function(sorted, k) {\n\t\t\tsorted[k] = sortValue(obj[k]);\n\t\t\treturn sorted;\n\t\t}, {});\n\t},\n\n\t/**\n\t * Convert a counter to a Base64 encoded string.\n\t * Padding is stripped. /,+ are replaced with _,- respectively.\n\t * Warning: Max integer is 2^31 - 1 for bitwise operations.\n\t *\n\t * @param n\n\t */\n\tcounterToBase64: function(n) {\n\t\t/* eslint-disable no-bitwise */\n\t\tvar arr = [];\n\t\tdo {\n\t\t\tarr.unshift(n & 0xff);\n\t\t\tn >>= 8;\n\t\t} while (n > 0);\n\t\treturn (Buffer.from(arr))\n\t\t\t.toString(\"base64\")\n\t\t\t.replace(/=/g, \"\")\n\t\t\t.replace(/\\//g, \"_\")\n\t\t\t.replace(/\\+/g, \"-\");\n\t\t/* eslint-enable no-bitwise */\n\t},\n\n\t/**\n\t * Escape special regexp characters in a string.\n\t *\n\t * @param {string} s\n\t * @return {string} A regular expression string that matches the\n\t *  literal characters in s.\n\t */\n\tescapeRegExp: function(s) {\n\t\treturn s.replace(/[\\^\\\\$*+?.()|{}\\[\\]\\/]/g, '\\\\$&');\n\t},\n\n\t/**\n\t * Escape special regexp characters in a string, returning a\n\t * case-insensitive regular expression.  
This is usually denoted\n\t * by something like `(?i:....)` in most programming languages,\n\t * but JavaScript doesn't support embedded regexp flags.\n\t *\n\t * @param {string} s\n\t * @return {string} A regular expression string that matches the\n\t *  literal characters in s.\n\t */\n\tescapeRegExpIgnoreCase: function(s) {\n\t\t// Using Array.from() here ensures we split on unicode codepoints,\n\t\t// which may be longer than a single JavaScript character.\n\t\treturn Array.from(s).map((c) => {\n\t\t\tif (/[\\^\\\\$*+?.()|{}\\[\\]\\/]/.test(c)) { return '\\\\' + c; }\n\t\t\tconst uc = c.toUpperCase();\n\t\t\tconst lc = c.toLowerCase();\n\t\t\tif (c === lc && c === uc) { return c; }\n\t\t\tif (uc.length === 1 && lc.length === 1) { return `[${uc}${lc}]`; }\n\t\t\treturn `(?:${uc}|${lc})`;\n\t\t}).join('');\n\t},\n\n\t/**\n\t * Join pieces of regular expressions together.  This helps avoid\n\t * having to switch between string and regexp quoting rules, and\n\t * can also give you a poor-man's version of the \"x\" flag, ie:\n\t * ```\n\t *  var re = rejoin( \"(\",\n\t *      /foo|bar/, \"|\",\n\t *      someRegExpFromAVariable\n\t *      \")\", { flags: \"i\" } );\n\t * ```\n\t * Note that this is basically string concatenation, except that\n\t * regular expressions are converted to strings using their `.source`\n\t * property, and then the final resulting string is converted to a\n\t * regular expression.\n\t *\n\t * If the final argument is a regular expression, its flags will be\n\t * used for the result.  
Alternatively, you can make the final argument\n\t * an object, with a `flags` property (as shown in the example above).\n\t *\n\t * @return {RegExp}\n\t */\n\trejoin: function() {\n\t\tvar regexps = Array.from(arguments);\n\t\tvar last = lastItem(regexps);\n\t\tvar flags;\n\t\tif (typeof (last) === 'object') {\n\t\t\tif (last instanceof RegExp) {\n\t\t\t\tflags = /\\/([gimy]*)$/.exec(last.toString())[1];\n\t\t\t} else {\n\t\t\t\tflags = regexps.pop().flags;\n\t\t\t}\n\t\t}\n\t\treturn new RegExp(regexps.reduce(function(acc, r) {\n\t\t\treturn acc + (r instanceof RegExp ? r.source : r);\n\t\t}, ''), flags === undefined ? '' : flags);\n\t},\n\n\t/**\n\t * Append an array to an accumulator using the most efficient method\n\t * available. Makes sure that accumulation is O(n).\n\t *\n\t * @param accum\n\t * @param arr\n\t */\n\tpushArray: function push(accum, arr) {\n\t\tif (accum.length < arr.length) {\n\t\t\treturn accum.concat(arr);\n\t\t} else {\n\t\t\t// big accum & arr\n\t\t\tfor (var i = 0, l = arr.length; i < l; i++) {\n\t\t\t\taccum.push(arr[i]);\n\t\t\t}\n\t\t\treturn accum;\n\t\t}\n\t},\n\n\t/**\n\t * Helper function to ease migration to Promise-based control flow\n\t * (aka, \"after years of wandering, arrive in the Promise land\").\n\t * This function allows retrofitting an existing callback-based\n\t * method to return an equivalent Promise, allowing enlightened\n\t * new code to omit the callback parameter and treat it as if\n\t * it had an API which simply returned a Promise for the result.\n\t *\n\t * Sample use:\n\t * ```\n\t *   // callback is node-style: callback(err, value)\n\t *   function legacyApi(param1, param2, callback) {\n\t *     callback = JSUtils.mkPromised(callback); // THIS LINE IS NEW\n\t *     ... some implementation here...\n\t *     return callback.promise; // THIS LINE IS NEW\n\t *   }\n\t *   // old-style caller, still works:\n\t *   legacyApi(x, y, function(err, value) { ... 
});\n\t *   // new-style caller, such hotness:\n\t *   return legacyApi(x, y).then(function(value) { ... });\n\t * ```\n\t * The optional `names` parameter to `mkPromised` is the same\n\t * as the optional second argument to `Promise.promisify` in\n\t * {@link https://github.com/cscott/prfun}.\n\t * It allows the use of `mkPromised` for legacy functions which\n\t * promise multiple results to their callbacks, eg:\n\t * ```\n\t *   callback(err, body, response);  // from npm \"request\" module\n\t * ```\n\t * For this callback signature, you have two options:\n\t * 1. Pass `true` as the names parameter:\n\t *    ```\n\t *      function legacyRequest(options, callback) {\n\t *        callback = JSUtils.mkPromised(callback, true);\n\t *        ... existing implementation...\n\t *        return callback.promise;\n\t *      }\n\t *    ```\n\t *    This resolves the promise with the array `[body, response]`, so\n\t *    a Promise-using caller looks like:\n\t *    ```\n\t *      return legacyRequest(options).then(function(r) {\n\t *        var body = r[0], response = r[1];\n\t *        ...\n\t *      }\n\t *    ```\n\t *    If you are using `prfun` then `Promise#spread` is convenient:\n\t *    ```\n\t *      return legacyRequest(options).spread(function(body, response) {\n\t *        ...\n\t *      });\n\t *    ```\n\t * 2. Alternatively (and probably preferably), provide an array of strings\n\t *    as the `names` parameter:\n\t *    ```\n\t *      function legacyRequest(options, callback) {\n\t *        callback = JSUtils.mkPromised(callback, ['body','response']);\n\t *        ... 
existing implementation...\n\t *        return callback.promise;\n\t *      }\n\t *    ```\n\t *    The resolved value will be an object with those fields:\n\t *    ```\n\t *      return legacyRequest(options).then(function(r) {\n\t *        var body = r.body, response = r.response;\n\t *        ...\n\t *      }\n\t *    ```\n\t * Note that in both cases the legacy callback behavior is unchanged:\n\t * ```\n\t *   legacyRequest(options, function(err, body, response) { ... });\n\t * ```\n\t *\n\t * @param {Function|undefined} callback\n\t * @param {true|Array<string>} [names]\n\t * @return {Function}\n\t * @return {Promise} [return.promise] A promise that will be fulfilled\n\t *  when the returned callback function is invoked.\n\t */\n\tmkPromised: function(callback, names) {\n\t\tvar res, rej;\n\t\tvar p = new Promise(function(_res, _rej) { res = _res; rej = _rej; });\n\t\tvar f = function(e, v) {\n\t\t\tif (e) {\n\t\t\t\trej(e);\n\t\t\t} else if (names === true) {\n\t\t\t\tres(Array.prototype.slice.call(arguments, 1));\n\t\t\t} else if (names) {\n\t\t\t\tvar value = {};\n\t\t\t\tfor (var index in names) {\n\t\t\t\t\tvalue[names[index]] = arguments[(+index) + 1];\n\t\t\t\t}\n\t\t\t\tres(value);\n\t\t\t} else {\n\t\t\t\tres(v);\n\t\t\t}\n\t\t\treturn callback && callback.apply(this, arguments);\n\t\t};\n\t\tf.promise = p;\n\t\treturn f;\n\t},\n\n\t/**\n\t * Determine whether two objects are identical, recursively.\n\t *\n\t * @param {any} a\n\t * @param {any} b\n\t * @return {boolean}\n\t */\n\tdeepEquals: function(a, b) {\n\t\tvar i;\n\t\tif (a === b) {\n\t\t\t// If only it were that simple.\n\t\t\treturn true;\n\t\t}\n\n\t\tif (a === undefined || b === undefined ||\n\t\t\t\ta === null || b === null) {\n\t\t\treturn false;\n\t\t}\n\n\t\tif (a.constructor !== b.constructor) {\n\t\t\treturn false;\n\t\t}\n\n\t\tif (a instanceof Object) {\n\t\t\tfor (i in a) {\n\t\t\t\tif (!this.deepEquals(a[i], b[i])) {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t}\n\t\t\tfor (i in b) 
{\n\t\t\t\tif (a[i] === undefined) {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t}\n\t\t\treturn true;\n\t\t}\n\n\t\treturn false;\n\t},\n\n\t/**\n\t * Return accurate system time\n\t *\n\t * @return {number}\n\t */\n\tstartTime: function() {\n\t\tvar startHrTime = process.hrtime();\n\t\tvar milliseconds = (startHrTime[0] * 1e9 + startHrTime[1]) / 1000000;\t// convert seconds and nanoseconds to a scalar milliseconds value\n\t\treturn milliseconds;\n\t},\n\n\t/**\n\t * Return millisecond accurate system time differential\n\t *\n\t * @param {number} previousTime\n\t * @return {number}\n\t */\n\telapsedTime: function(previousTime) {\n\t\tvar endHrTime = process.hrtime();\n\t\tvar milliseconds = (endHrTime[0] * 1e9 + endHrTime[1]) / 1000000;\t// convert seconds and nanoseconds to a scalar milliseconds value\n\t\treturn milliseconds - previousTime;\n\t},\n\n};\n\nif (typeof module === \"object\") {\n\tmodule.exports.JSUtils = JSUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/utils/promise.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/wt2html/XMLSerializer.js","messages":[{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":241,"column":1,"nodeType":"Block","endLine":249,"endColumn":4},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":244,"column":null,"nodeType":"Block","endLine":244,"endColumn":null},{"ruleId":"no-shadow","severity":2,"message":"'node' is already declared in the upper 
scope.","line":260,"column":9,"nodeType":"Identifier","messageId":"noShadow","endLine":260,"endColumn":13}],"errorCount":1,"warningCount":2,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * Stand-alone XMLSerializer for DOM3 documents\n *\n * The output is identical to standard XHTML5 DOM serialization, as given by\n * http://www.w3.org/TR/html-polyglot/\n * and\n * https://html.spec.whatwg.org/multipage/syntax.html#serialising-html-fragments\n * except that we may quote attributes with single quotes, *only* where that would\n * result in more compact output than the standard double-quoted serialization.\n *\n * @module\n */\n\n'use strict';\n\nconst entities = require('entities');\nconst { DOMUtils } = require('../utils/DOMUtils.js');\nconst { JSUtils } = require('../utils/jsutils.js');\nconst { WTUtils } = require('../utils/WTUtils.js');\nconst { WikitextConstants } = require('../config/WikitextConstants.js');\n\n// nodeType constants\nvar ELEMENT_NODE = 1;\nvar TEXT_NODE = 3;\nvar COMMENT_NODE = 8;\nvar DOCUMENT_NODE = 9;\nvar DOCUMENT_FRAGMENT_NODE = 11;\n\n/**\n * HTML5 void elements\n *\n * @namespace\n * @private\n */\nvar emptyElements = {\n\tarea: true,\n\tbase: true,\n\tbasefont: true,\n\tbgsound: true,\n\tbr: true,\n\tcol: true,\n\tcommand: true,\n\tembed: true,\n\tframe: true,\n\thr: true,\n\timg: true,\n\tinput: true,\n\tkeygen: true,\n\tlink: true,\n\tmeta: true,\n\tparam: true,\n\tsource: true,\n\ttrack: true,\n\twbr: true,\n};\n\n/**\n * HTML5 elements with raw (unescaped) content\n *\n * @namespace\n * @private\n */\nvar hasRawContent = {\n\tstyle: true,\n\tscript: true,\n\txmp: true,\n\tiframe: true,\n\tnoembed: true,\n\tnoframes: true,\n\tplaintext: true,\n\tnoscript: true,\n};\n\n/**\n * Elements that strip leading newlines\n * http://www.whatwg.org/specs/web-apps/current-work/multipage/the-end.html#html-fragment-serialization-algorithm\n *\n * @namespace\n * @private\n */\nvar newlineStrippingElements = {\n\tpre: 
true,\n\ttextarea: true,\n\tlisting: true,\n};\n\n/**\n * @namespace\n */\nvar XMLSerializer = {};\n\nfunction serializeToString(node, options, accum) {\n\tvar child;\n\tif (options.tunnelFosteredContent && WikitextConstants.HTML.FosterablePosition.has(node.nodeName)) {\n\t\t// Tunnel fosterable metas as comments.\n\t\t// This is analogous to what is done when treebuilding.\n\t\tconst ownerDoc = node.ownerDocument;\n\t\tconst allowedTags = WikitextConstants.HTML.TableContentModels.get(node.nodeName);\n\t\tchild = node.firstChild;\n\t\twhile (child) {\n\t\t\tconst next = child.nextSibling;\n\t\t\tif (DOMUtils.isText(child)) {\n\t\t\t\tconsole.assert(DOMUtils.isIEW(child), 'Only expecting whitespace!');\n\t\t\t} else if (DOMUtils.isElt(child) && !allowedTags.includes(child.nodeName)) {\n\t\t\t\tconsole.assert(child.nodeName === 'META', 'Only fosterable metas expected!');\n\t\t\t\tconst comment = WTUtils.fosterCommentData(\n\t\t\t\t\tchild.getAttribute('typeof'),\n\t\t\t\t\tArray.from(child.attributes).map(a => [a.name, a.value]),\n\t\t\t\t\ttrue\n\t\t\t\t);\n\t\t\t\tnode.replaceChild(ownerDoc.createComment(comment), child);\n\t\t\t}\n\t\t\tchild = next;\n\t\t}\n\t}\n\tswitch (node.nodeType) {\n\t\tcase ELEMENT_NODE:\n\t\t\tchild = node.firstChild;\n\t\t\tvar attrs = node.attributes;\n\t\t\tvar len = attrs.length;\n\t\t\tvar nodeName = node.tagName.toLowerCase();\n\t\t\tvar localName = node.localName;\n\t\t\taccum('<' + localName, node);\n\t\t\tfor (var i = 0; i < len; i++) {\n\t\t\t\tvar attr = attrs.item(i);\n\t\t\t\tif (options.smartQuote &&\n\t\t\t\t\t\t// More double quotes than single quotes in value?\n\t\t\t\t\t\t(attr.value.match(/\"/g) || []).length >\n\t\t\t\t\t\t(attr.value.match(/'/g) || []).length) {\n\t\t\t\t\t// use single quotes\n\t\t\t\t\taccum(' ' + attr.name + \"='\"\n\t\t\t\t\t\t\t+ attr.value.replace(/[<&']/g, entities.encodeHTML5) + \"'\",\n\t\t\t\t\t\t\tnode);\n\t\t\t\t} else {\n\t\t\t\t\t// use double quotes\n\t\t\t\t\taccum(' ' + attr.name + 
'=\"'\n\t\t\t\t\t\t\t+ attr.value.replace(/[<&\"]/g, entities.encodeHTML5) + '\"',\n\t\t\t\t\t\t\tnode);\n\t\t\t\t}\n\t\t\t}\n\t\t\tif (child || !emptyElements[nodeName]) {\n\t\t\t\taccum('>', node, 'start');\n\t\t\t\t// if is cdata child node\n\t\t\t\tif (hasRawContent[nodeName]) {\n\t\t\t\t\t// TODO: perform context-sensitive escaping?\n\t\t\t\t\t// Currently this content is not normally part of our DOM, so\n\t\t\t\t\t// no problem. If it was, we'd probably have to do some\n\t\t\t\t\t// tag-specific escaping. Examples:\n\t\t\t\t\t// * < to \\u003c in <script>\n\t\t\t\t\t// * < to \\3c in <style>\n\t\t\t\t\t// ...\n\t\t\t\t\tif (child) {\n\t\t\t\t\t\taccum(child.data, node);\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\tif (child && newlineStrippingElements[localName]\n\t\t\t\t\t\t\t&& child.nodeType === TEXT_NODE && /^\\n/.test(child.data)) {\n\t\t\t\t\t\t/* If current node is a pre, textarea, or listing element,\n\t\t\t\t\t\t * and the first child node of the element, if any, is a\n\t\t\t\t\t\t * Text node whose character data has as its first\n\t\t\t\t\t\t * character a U+000A LINE FEED (LF) character, then\n\t\t\t\t\t\t * append a U+000A LINE FEED (LF) character. 
*/\n\t\t\t\t\t\taccum('\\n', node);\n\t\t\t\t\t}\n\t\t\t\t\twhile (child) {\n\t\t\t\t\t\tserializeToString(child, options, accum);\n\t\t\t\t\t\tchild = child.nextSibling;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\taccum('</' + localName + '>', node, 'end');\n\t\t\t} else {\n\t\t\t\taccum('/>', node, 'end');\n\t\t\t}\n\t\t\treturn;\n\t\tcase DOCUMENT_NODE:\n\t\tcase DOCUMENT_FRAGMENT_NODE:\n\t\t\tchild = node.firstChild;\n\t\t\twhile (child) {\n\t\t\t\tserializeToString(child, options, accum);\n\t\t\t\tchild = child.nextSibling;\n\t\t\t}\n\t\t\treturn;\n\t\tcase TEXT_NODE:\n\t\t\treturn accum(node.data.replace(/[<&]/g, entities.encodeHTML5), node);\n\t\tcase COMMENT_NODE:\n\t\t\t// According to\n\t\t\t// http://www.w3.org/TR/DOM-Parsing/#dfn-concept-serialize-xml\n\t\t\t// we could throw an exception here if node.data would not create\n\t\t\t// a \"well-formed\" XML comment.  But we use entity encoding when\n\t\t\t// we create the comment node to ensure that node.data will always\n\t\t\t// be okay; see WTUtils.encodeComment().\n\t\t\treturn accum('<!--' + node.data + '-->', node);\n\t\tdefault:\n\t\t\taccum('??' + node.nodeName, node);\n\t}\n}\n\nvar accumOffsets = function(out, bit, node, flag) {\n\tif (DOMUtils.isBody(node)) {\n\t\tout.html += bit;\n\t\tif (flag === 'start') {\n\t\t\tout.start = out.html.length;\n\t\t} else if (flag === 'end') {\n\t\t\tout.start = null;\n\t\t\tout.uid = null;\n\t\t}\n\t} else if (!DOMUtils.isElt(node) || out.start === null || !DOMUtils.isBody(node.parentNode)) {\n\t\t// In case you're wondering, out.start may never be set if body\n\t\t// isn't a child of the node passed to serializeToString, or if it\n\t\t// is the node itself but options.innerXML is true.\n\t\tout.html += bit;\n\t\tif (out.uid !== null) {\n\t\t\tout.offsets[out.uid].html[1] += bit.length;\n\t\t}\n\t} else {\n\t\tvar newUid = node.hasAttribute('id') ? 
node.getAttribute('id') : null;\n\t\t// Encapsulated siblings don't have generated ids (but may have an id),\n\t\t// so associate them with preceding content.\n\t\tif (newUid && newUid !== out.uid && !out.last) {\n\t\t\tif (!WTUtils.isEncapsulationWrapper(node)) {\n\t\t\t\tout.uid = newUid;\n\t\t\t} else if (WTUtils.isFirstEncapsulationWrapperNode(node)) {\n\t\t\t\tvar about = node.getAttribute('about');\n\t\t\t\tout.last = JSUtils.lastItem(WTUtils.getAboutSiblings(node, about));\n\t\t\t\tout.uid = newUid;\n\t\t\t}\n\t\t}\n\t\tif (out.last === node && flag === \"end\") {\n\t\t\tout.last = null;\n\t\t}\n\t\tconsole.assert(out.uid !== null);\n\t\tif (!out.offsets.hasOwnProperty(out.uid)) {\n\t\t\tvar dt = out.html.length - out.start;\n\t\t\tout.offsets[out.uid] = { html: [dt, dt] };\n\t\t}\n\t\tout.html += bit;\n\t\tout.offsets[out.uid].html[1] += bit.length;\n\t}\n};\n\n/**\n * Serialize an HTML DOM3 node to XHTML.\n *\n * @param {Node} node\n * @param {Object} [options]\n * @param {boolean} [options.smartQuote=true]\n * @param {boolean} [options.innerXML=false]\n * @param {boolean} [options.captureOffsets=false]\n */\nXMLSerializer.serialize = function(node, options) {\n\tif (!options) { options = {}; }\n\tif (!options.hasOwnProperty('smartQuote')) {\n\t\toptions.smartQuote = true;\n\t}\n\tif (node.nodeName === '#document') {\n\t\tnode = node.documentElement;\n\t}\n\tvar out = { html: '', offsets: {}, start: null, uid: null, last: null };\n\tvar accum = options.captureOffsets ?\n\t\t(bit, node, flag) => accumOffsets(out, bit, node, flag) : (bit) => { out.html += bit; };\n\tif (options.innerXML) {\n\t\tfor (var child = node.firstChild; child; child = child.nextSibling) {\n\t\t\tserializeToString(child, options, accum);\n\t\t}\n\t} else {\n\t\tserializeToString(node, options, accum);\n\t}\n\t// Ensure there's a doctype for documents.\n\tif (!options.innerXML && /^html$/i.test(node.nodeName)) {\n\t\tout.html = '<!DOCTYPE html>\\n' + out.html;\n\t}\n\t// Drop the 
bookkeeping\n\tconst bookkeeping = { start: undefined, uid: undefined, last: undefined };\n\tif (!options.captureOffsets) { bookkeeping.offsets = undefined; }\n\tObject.assign(out, bookkeeping);\n\treturn out;\n};\n\n\nmodule.exports = XMLSerializer;\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/lib/wt2html/tokenizer.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/localsettings.example.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/package-lock.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/package.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/src/Config/variants.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/MockEnv.js","messages":[],"errorCount":0,
"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/TestUtils.js","messages":[{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":171,"column":null,"nodeType":"Block","endLine":171,"endColumn":null},{"ruleId":"jsdoc/no-undefined-types","severity":1,"message":"The type 'Node' is undefined.","line":176,"column":null,"nodeType":"Block","endLine":176,"endColumn":null},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":283,"column":1,"nodeType":"Block","endLine":287,"endColumn":4},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"html\" type.","line":286,"column":null,"nodeType":"Block","endLine":286,"endColumn":null},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":999,"column":3,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":999,"endColumn":18},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":1015,"column":5,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":1015,"endColumn":20}],"errorCount":2,"warningCount":4,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * @module\n */\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\nvar colors = require('colors');\nvar entities = require('entities');\nvar yargs = require('yargs');\n\nvar Diff = require('../lib/utils/Diff.js').Diff;\nvar ContentUtils = require('../lib/utils/ContentUtils.js').ContentUtils;\nvar DOMUtils = require('../lib/utils/DOMUtils.js').DOMUtils;\nvar DOMDataUtils = require('../lib/utils/DOMDataUtils.js').DOMDataUtils;\nvar ScriptUtils = require('../tools/ScriptUtils.js').ScriptUtils;\nvar Util = 
require('../lib/utils/Util.js').Util;\nvar WTUtils = require('../lib/utils/WTUtils.js').WTUtils;\nvar DOMNormalizer = require('../lib/html2wt/DOMNormalizer.js').DOMNormalizer;\nvar MockEnv = require('./MockEnv.js').MockEnv;\nvar JSUtils = require('../lib/utils/jsutils.js').JSUtils;\n\nvar TestUtils = {};\n\n/**\n * Little helper function for encoding XML entities.\n *\n * @param {string} string\n * @return {string}\n */\nTestUtils.encodeXml = function(string) {\n\treturn entities.encodeXML(string);\n};\n\n/**\n * Specialized normalization of the PHP parser & Parsoid output, to ignore\n * a few known-ok differences in parser test runs.\n *\n * This code is also used by the Parsoid round-trip testing code.\n *\n * If parsoidOnly is true-ish, we allow more markup through (like property\n * and typeof attributes), for better checking of parsoid-only test cases.\n *\n * @param {string} domBody\n * @param {Object} options\n * @param {boolean} [options.parsoidOnly=false]\n * @param {boolean} [options.preserveIEW=false]\n * @param {boolean} [options.scrubWikitext=false]\n * @return {string}\n */\nTestUtils.normalizeOut = function(domBody, options) {\n\tif (!options) {\n\t\toptions = {};\n\t}\n\tconst parsoidOnly = options.parsoidOnly;\n\tconst preserveIEW = options.preserveIEW;\n\n\tif (options.scrubWikitext) {\n\t\t// Mock env obj\n\t\t//\n\t\t// FIXME: This is ugly.\n\t\t// (a) The normalizer shouldn't need the full env.\n\t\t//     Pass options and a logger instead?\n\t\t// (b) DOM diff code is using page-id for some reason.\n\t\t//     That feels like a carryover of 2013 era code.\n\t\t//     If possible, get rid of it and diff-mark dependency\n\t\t//     on the env object.\n\t\tconst env = new MockEnv({ scrubWikitext: true }, null);\n\t\tif (typeof (domBody) === 'string') {\n\t\t\tdomBody = env.createDocument(domBody).body;\n\t\t}\n\t\tvar mockState = {\n\t\t\tenv,\n\t\t\tselserMode: false,\n\t\t};\n\t\tDOMDataUtils.visitAndLoadDataAttribs(domBody, { markNew: true 
});\n\t\tdomBody = (new DOMNormalizer(mockState).normalize(domBody));\n\t\tDOMDataUtils.visitAndStoreDataAttribs(domBody);\n\t} else {\n\t\tif (typeof (domBody) === 'string') {\n\t\t\tdomBody = DOMUtils.parseHTML(domBody).body;\n\t\t}\n\t}\n\n\tvar stripTypeof = parsoidOnly ?\n\t\t/^mw:Placeholder$/ :\n\t\t/^mw:(?:DisplaySpace|Placeholder|Nowiki|Transclusion|Entity)$/;\n\tdomBody = this.unwrapSpansAndNormalizeIEW(domBody, stripTypeof, parsoidOnly, preserveIEW);\n\tvar out = ContentUtils.toXML(domBody, { innerXML: true });\n\t// NOTE that we use a slightly restricted regexp for \"attribute\"\n\t//  which works for the output of DOM serialization.  For example,\n\t//  we know that attribute values will be surrounded with double quotes,\n\t//  not unquoted or quoted with single quotes.  The serialization\n\t//  algorithm is given by:\n\t//  http://www.whatwg.org/specs/web-apps/current-work/multipage/the-end.html#serializing-html-fragments\n\tif (!/[^<]*(<\\w+(\\s+[^\\0-\\cZ\\s\"'>\\/=]+(=\"[^\"]*\")?)*\\/?>[^<]*)*/.test(out)) {\n\t\tthrow new Error(\"normalizeOut input is not in standard serialized form\");\n\t}\n\n\t// Eliminate a source of indeterminacy from leaked strip markers\n\tout = out.replace(/UNIQ-.*?-QINU/g, '');\n\n\t// Normalize COINS ids -- they aren't stable\n\tout = out.replace(/\\s?id=['\"]coins_\\d+['\"]/ig, '');\n\n\t// Eliminate transience from priority hints (T216499)\n\tout = out.replace(/\\s?importance=\"high\"/g, '');\n\tout = out.replace(/\\s?elementtiming=\"thumbnail-(high|top)\"/g, '');\n\n\t// maplink extension\n\tout = out.replace(/\\s?data-overlays='[^']*'/ig, '');\n\n\tif (parsoidOnly) {\n\t\t// unnecessary attributes, we don't need to check these\n\t\t// style is in there because we should only check classes.\n\t\tout = out.replace(/ (data-parsoid|prefix|about|rev|datatype|inlist|usemap|vocab|content|style)=\\\\?\"[^\\\"]*\\\\?\"/g, '');\n\t\t// single-quoted variant\n\t\tout = out.replace(/ 
(data-parsoid|prefix|about|rev|datatype|inlist|usemap|vocab|content|style)=\\\\?'[^\\']*\\\\?'/g, '');\n\t\t// apos variant\n\t\tout = out.replace(/ (data-parsoid|prefix|about|rev|datatype|inlist|usemap|vocab|content|style)=&apos;.*?&apos;/g, '');\n\n\t\t// strip self-closed <nowiki /> because we frequently test WTS\n\t\t// <nowiki> insertion by providing an html/parsoid section with the\n\t\t// <meta> tags stripped out, allowing the html2wt test to verify that\n\t\t// the <nowiki> is correctly added during WTS, while still allowing\n\t\t// the html2html and wt2html versions of the test to pass as a\n\t\t// sanity check.  If <meta>s were not stripped, these tests would all\n\t\t// have to be modified and split up.  Not worth it at this time.\n\t\t// (see commit 689b22431ad690302420d049b10e689de6b7d426)\n\t\tout = out\n\t\t\t.replace(/<span typeof=\"mw:Nowiki\"><\\/span>/g, '');\n\n\t\treturn out;\n\t}\n\n\t// Normalize headings by stripping out Parsoid-added ids so that we don't\n\t// have to add these ids to every parser test that uses headings.\n\t// We will test the id generation scheme separately via mocha tests.\n\tout = out.replace(/(<h[1-6].*?) 
id=\"[^\"]*\"([^>]*>)/g, '$1$2');\n\n\t// strip meta/link elements\n\tout = out\n\t\t.replace(/<\\/?(?:meta|link)(?: [^\\0-\\cZ\\s\"'>\\/=]+(?:=(?:\"[^\"]*\"|'[^']*'))?)*\\/?>/g, '');\n\t// Ignore troublesome attributes.\n\t// Strip JSON attributes like data-mw and data-parsoid early so that\n\t// comment stripping in normalizeNewlines does not match unbalanced\n\t// comments in wikitext source.\n\tout = out.replace(/ (data-mw|data-parsoid|resource|rel|prefix|about|rev|datatype|inlist|property|usemap|vocab|content|class)=\\\\?\"[^\\\"]*\\\\?\"/g, '');\n\t// single-quoted variant\n\tout = out.replace(/ (data-mw|data-parsoid|resource|rel|prefix|about|rev|datatype|inlist|property|usemap|vocab|content|class)=\\\\?'[^\\']*\\\\?'/g, '');\n\t// strip typeof last\n\tout = out.replace(/ typeof=\"[^\\\"]*\"/g, '');\n\n\treturn out\n\t\t// replace mwt ids\n\t\t.replace(/ id=\"mw((t\\d+)|([\\w-]{2,}))\"/g, '')\n\t\t.replace(/<span[^>]+about=\"[^\"]*\"[^>]*>/g, '')\n\t\t.replace(/(\\s)<span>\\s*<\\/span>\\s*/g, '$1')\n\t\t.replace(/<span>\\s*<\\/span>/g, '')\n\t\t.replace(/(href=\")(?:\\.?\\.\\/)+/g, '$1')\n\t\t// replace unnecessary URL escaping\n\t\t.replace(/ href=\"[^\"]*\"/g, Util.decodeURI)\n\t\t// strip thumbnail size prefixes\n\t\t.replace(/(src=\"[^\"]*?)\\/thumb(\\/[0-9a-f]\\/[0-9a-f]{2}\\/[^\\/]+)\\/[0-9]+px-[^\"\\/]+(?=\")/g, '$1$2');\n};\n\n/**\n * Normalize newlines in IEW to spaces instead.\n *\n * @param {Node} body\n *   The document `<body>` node to normalize.\n * @param {RegExp} [stripSpanTypeof]\n * @param {boolean} [parsoidOnly=false]\n * @param {boolean} [preserveIEW=false]\n * @return {Node}\n */\nTestUtils.unwrapSpansAndNormalizeIEW = function(body, stripSpanTypeof, parsoidOnly, preserveIEW) {\n\tvar newlineAround = function(node) {\n\t\treturn node && /^(BODY|CAPTION|DIV|DD|DT|LI|P|TABLE|TR|TD|TH|TBODY|DL|OL|UL|H[1-6])$/.test(node.nodeName);\n\t};\n\tvar unwrapSpan;  // forward declare\n\tvar cleanSpans = function(node) {\n\t\tvar child, next;\n\t\tif 
(!stripSpanTypeof) { return; }\n\t\tfor (child = node.firstChild; child; child = next) {\n\t\t\tnext = child.nextSibling;\n\t\t\tif (child.nodeName === 'SPAN' &&\n\t\t\t\tstripSpanTypeof.test(child.getAttribute('typeof') || '')) {\n\t\t\t\tunwrapSpan(node, child);\n\t\t\t}\n\t\t}\n\t};\n\tunwrapSpan = function(parent, node) {\n\t\t// first recurse to unwrap any spans in the immediate children.\n\t\tcleanSpans(node);\n\t\t// now unwrap this span.\n\t\tDOMUtils.migrateChildren(node, parent, node);\n\t\tparent.removeChild(node);\n\t};\n\tvar visit = function(node, stripLeadingWS, stripTrailingWS, inPRE) {\n\t\tvar child, next, prev;\n\t\tif (node.nodeName === 'PRE') {\n\t\t\t// Preserve newlines in <pre> tags\n\t\t\tinPRE = true;\n\t\t}\n\t\tif (!preserveIEW && DOMUtils.isText(node)) {\n\t\t\tif (!inPRE) {\n\t\t\t\tnode.data = node.data.replace(/\\s+/g, ' ');\n\t\t\t}\n\t\t\tif (stripLeadingWS) {\n\t\t\t\tnode.data = node.data.replace(/^\\s+/, '');\n\t\t\t}\n\t\t\tif (stripTrailingWS) {\n\t\t\t\tnode.data = node.data.replace(/\\s+$/, '');\n\t\t\t}\n\t\t}\n\t\t// unwrap certain SPAN nodes\n\t\tcleanSpans(node);\n\t\t// now remove comment nodes\n\t\tif (!parsoidOnly) {\n\t\t\tfor (child = node.firstChild; child; child = next) {\n\t\t\t\tnext = child.nextSibling;\n\t\t\t\tif (DOMUtils.isComment(child)) {\n\t\t\t\t\tnode.removeChild(child);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t\t// reassemble text nodes split by a comment or span, if necessary\n\t\tnode.normalize();\n\t\t// now recurse.\n\t\tif (node.nodeName === 'PRE') {\n\t\t\t// hack, since PHP adds a newline before </pre>\n\t\t\tstripLeadingWS = false;\n\t\t\tstripTrailingWS = true;\n\t\t} else if (node.nodeName === 'SPAN' &&\n\t\t\t\t/^mw[:]/.test(node.getAttribute('typeof') || '')) {\n\t\t\t// SPAN is transparent; pass the strip parameters down to kids\n\t\t} else {\n\t\t\tstripLeadingWS = stripTrailingWS = newlineAround(node);\n\t\t}\n\t\tchild = node.firstChild;\n\t\t// Skip over the empty mw:FallbackId <span> and strip 
leading WS\n\t\t// on the other side of it.\n\t\tif (/^H[1-6]$/.test(node.nodeName) &&\n\t\t\tchild && WTUtils.isFallbackIdSpan(child)) {\n\t\t\tchild = child.nextSibling;\n\t\t}\n\t\tfor (; child; child = next) {\n\t\t\tnext = child.nextSibling;\n\t\t\tvisit(child,\n\t\t\t\tstripLeadingWS,\n\t\t\t\tstripTrailingWS && !child.nextSibling,\n\t\t\t\tinPRE);\n\t\t\tstripLeadingWS = false;\n\t\t}\n\t\tif (inPRE || preserveIEW) { return node; }\n\t\t// now add newlines around appropriate nodes.\n\t\tfor (child = node.firstChild; child; child = next) {\n\t\t\tprev = child.previousSibling;\n\t\t\tnext = child.nextSibling;\n\t\t\tif (newlineAround(child)) {\n\t\t\t\tif (prev && DOMUtils.isText(prev)) {\n\t\t\t\t\tprev.data = prev.data.replace(/\\s*$/, '\\n');\n\t\t\t\t} else {\n\t\t\t\t\tprev = node.ownerDocument.createTextNode('\\n');\n\t\t\t\t\tnode.insertBefore(prev, child);\n\t\t\t\t}\n\t\t\t\tif (next && DOMUtils.isText(next)) {\n\t\t\t\t\tnext.data = next.data.replace(/^\\s*/, '\\n');\n\t\t\t\t} else {\n\t\t\t\t\tnext = node.ownerDocument.createTextNode('\\n');\n\t\t\t\t\tnode.insertBefore(next, child.nextSibling);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t\treturn node;\n\t};\n\t// clone body first, since we're going to destructively mutate it.\n\treturn visit(body.cloneNode(true), true, true, false);\n};\n\n/**\n * Strip some php output we aren't generating.\n *\n * @param html\n */\nTestUtils.normalizePhpOutput = function(html) {\n\treturn html\n\t\t// do not expect section editing for now\n\t\t.replace(/<span[^>]+class=\"mw-headline\"[^>]*>(.*?)<\\/span> *(<span class=\"mw-editsection\"><span class=\"mw-editsection-bracket\">\\[<\\/span>.*?<span class=\"mw-editsection-bracket\">\\]<\\/span><\\/span>)?/g, '$1')\n\t\t.replace(/<a[^>]+class=\"mw-headline-anchor\"[^>]*>§<\\/a>/g, '');\n};\n\n/**\n * Normalize the expected parser output by parsing it using a HTML5 parser and\n * re-serializing it to HTML. Ideally, the parser would normalize inter-tag\n * whitespace for us. 
For now, we fake that by simply stripping all newlines.\n *\n * @param {string} source\n * @return {string}\n */\nTestUtils.normalizeHTML = function(source) {\n\ttry {\n\t\tvar body = this.unwrapSpansAndNormalizeIEW(DOMUtils.parseHTML(source).body);\n\t\tvar html = ContentUtils.toXML(body, { innerXML: true })\n\t\t\t// a few things we ignore for now..\n\t\t\t//  .replace(/\\/wiki\\/Main_Page/g, 'Main Page')\n\t\t\t// do not expect a toc for now\n\t\t\t.replace(/<div[^>]+?id=\"toc\"[^>]*>\\s*<div id=\"toctitle\"[^>]*>[\\s\\S]+?<\\/div>[\\s\\S]+?<\\/div>\\s*/g, '');\n\t\treturn this.normalizePhpOutput(html)\n\t\t\t// remove empty span tags\n\t\t\t.replace(/(\\s)<span>\\s*<\\/span>\\s*/g, '$1')\n\t\t\t.replace(/<span>\\s*<\\/span>/g, '')\n\t\t\t// general class and titles, typically on links\n\t\t\t.replace(/ (class|rel|about|typeof)=\"[^\"]*\"/g, '')\n\t\t\t// strip red link markup, we do not check if a page exists yet\n\t\t\t.replace(/\\/index.php\\?title=([^']+?)&amp;action=edit&amp;redlink=1/g, '/wiki/$1')\n\t\t\t// strip red link title info\n\t\t\t.replace(/ \\((?:page does not exist|encara no existeix|bet ele jaratılmag'an|lonkásá  ezalí tɛ̂)\\)/g, '')  // eslint-disable-line\n\t\t\t// the expected html has some extra space in tags, strip it\n\t\t\t.replace(/<a +href/g, '<a href')\n\t\t\t.replace(/href=\"\\/wiki\\//g, 'href=\"')\n\t\t\t.replace(/\" +>/g, '\">')\n\t\t\t// parsoid always add a page name to lonely fragments\n\t\t\t.replace(/href=\"#/g, 'href=\"Main Page#')\n\t\t\t// replace unnecessary URL escaping\n\t\t\t.replace(/ href=\"[^\"]*\"/g, Util.decodeURI)\n\t\t\t// strip empty spans\n\t\t\t.replace(/(\\s)<span>\\s*<\\/span>\\s*/g, '$1')\n\t\t\t.replace(/<span>\\s*<\\/span>/g, '');\n\t} catch (e) {\n\t\tconsole.log(\"normalizeHTML failed on\" +\n\t\t\tsource + \" with the following error: \" + e);\n\t\tconsole.trace();\n\t\treturn source;\n\t}\n};\n\n/**\n * Colorize given number if <> 0.\n *\n * @param {number} count\n * @param {string} color\n * 
@return {string} Colorized count\n */\nvar colorizeCount = function(count, color) {\n\t// We need a string to use colors methods\n\tvar s = count.toString();\n\tif (count === 0 || !s[color]) {\n\t\treturn s;\n\t}\n\treturn s[color] + '';\n};\n\n/**\n * @param {Array} modesRan\n * @param {Object} stats\n * @param {number} stats.failedTests Number of failed tests due to differences in output.\n * @param {number} stats.passedTests Number of tests passed without any special consideration.\n * @param {Object} stats.modes All of the stats (failedTests and passedTests) per-mode.\n * @param {string} file\n * @param {number} loggedErrorCount\n * @param {RegExp|null} testFilter\n * @param {boolean} blacklistChanged\n * @return {number} The number of failures.\n */\nvar reportSummary = function(modesRan, stats, file, loggedErrorCount, testFilter, blacklistChanged) {\n\tvar curStr, mode, thisMode;\n\tvar failTotalTests = stats.failedTests;\n\tvar happiness = (\n\t\tstats.passedTestsUnexpected === 0 && stats.failedTestsUnexpected === 0\n\t);\n\tvar filename = (file === null) ? \"ALL TESTS\" : file;\n\n\tif (file === null) { console.log(); }\n\tconsole.log(\"==========================================================\");\n\tconsole.log(\"SUMMARY:\", happiness ? 
filename.green : filename.red);\n\tif (console.time && console.timeEnd && file !== null) {\n\t\tconsole.timeEnd('Execution time');\n\t}\n\n\tif (failTotalTests !== 0) {\n\t\tfor (var i = 0; i < modesRan.length; i++) {\n\t\t\tmode = modesRan[i];\n\t\t\tcurStr = mode + ': ';\n\t\t\tthisMode = stats.modes[mode];\n\t\t\tcurStr += colorizeCount(thisMode.passedTests, 'green') + ' passed (';\n\t\t\tcurStr += colorizeCount(thisMode.passedTestsUnexpected, 'red') + ' unexpected) / ';\n\t\t\tcurStr += colorizeCount(thisMode.failedTests, 'red') + ' failed (';\n\t\t\tcurStr += colorizeCount(thisMode.failedTestsUnexpected, 'red') + ' unexpected)';\n\t\t\tconsole.log(curStr);\n\t\t}\n\n\t\tcurStr = 'TOTAL' + ': ';\n\t\tcurStr += colorizeCount(stats.passedTests, 'green') + ' passed (';\n\t\tcurStr += colorizeCount(stats.passedTestsUnexpected, 'red') + ' unexpected) / ';\n\t\tcurStr += colorizeCount(stats.failedTests, 'red') + ' failed (';\n\t\tcurStr += colorizeCount(stats.failedTestsUnexpected, 'red') + ' unexpected)';\n\t\tconsole.log(curStr);\n\n\t\tif (file === null) {\n\t\t\tconsole.log(colorizeCount(stats.passedTests, 'green') +\n\t\t\t\t' total passed tests (expected ' +\n\t\t\t\t(stats.passedTests - stats.passedTestsUnexpected + stats.failedTestsUnexpected) +\n\t\t\t\t'), ' +\n\t\t\t\tcolorizeCount(failTotalTests , 'red') + ' total failures (expected ' +\n\t\t\t\t(stats.failedTests - stats.failedTestsUnexpected + stats.passedTestsUnexpected) +\n\t\t\t\t')');\n\t\t}\n\t} else {\n\t\tif (testFilter !== null) {\n\t\t\tconsole.log(\"Passed \" + stats.passedTests +\n\t\t\t\t\t\" of \" + stats.passedTests + \" tests matching \" + testFilter +\n\t\t\t\t\t\"... \" + \"ALL TESTS PASSED!\".green);\n\t\t} else {\n\t\t\t// Should not happen if it does: Champagne!\n\t\t\tconsole.log(\"Passed \" + stats.passedTests + \" of \" + stats.passedTests +\n\t\t\t\t\t\" tests... 
\" + \"ALL TESTS PASSED!\".green);\n\t\t}\n\t}\n\n\t// If we logged error messages, complain about it.\n\tvar logMsg = 'No errors logged.'.green;\n\tif (loggedErrorCount > 0) {\n\t\tlogMsg = (loggedErrorCount + \" errors logged.\").red;\n\t}\n\tif (file === null) {\n\t\tif (loggedErrorCount > 0) {\n\t\t\tlogMsg = ('' + loggedErrorCount).red;\n\t\t} else {\n\t\t\tlogMsg = ('' + loggedErrorCount).green;\n\t\t}\n\t\tlogMsg += ' errors logged.';\n\t}\n\tconsole.log(logMsg);\n\n\tvar failures = (\n\t\tstats.passedTestsUnexpected +\n\t\tstats.failedTestsUnexpected +\n\t\tloggedErrorCount\n\t);\n\n\t// If the blacklist changed, complain about it.\n\tif (blacklistChanged) {\n\t\tconsole.log(\"Blacklist changed!\".red);\n\t}\n\n\tif (file === null) {\n\t\tif (failures === 0) {\n\t\t\tconsole.log('--> ' + 'NO UNEXPECTED RESULTS'.green + ' <--');\n\t\t\tif (blacklistChanged) {\n\t\t\t\tconsole.log(\"Perhaps some tests were deleted or renamed.\");\n\t\t\t\tconsole.log(\"Use `bin/parserTests.js --rewrite-blacklist` to update blacklist.\");\n\t\t\t}\n\t\t} else {\n\t\t\tconsole.log(('--> ' + failures + ' UNEXPECTED RESULTS. 
<--').red);\n\t\t}\n\t}\n\n\treturn failures;\n};\n\nvar prettyPrintIOptions = function(iopts) {\n\tif (!iopts) { return ''; }\n\tvar ppValue = function(v) {\n\t\tif (Array.isArray(v)) {\n\t\t\treturn v.map(ppValue).join(',');\n\t\t}\n\t\tif (typeof v !== 'string') {\n\t\t\treturn JSON.stringify(v);\n\t\t}\n\t\tif (/^\\[\\[[^\\]]*\\]\\]$/.test(v) || /^[-\\w]+$/.test(v)) {\n\t\t\treturn v;\n\t\t}\n\t\treturn JSON.stringify(v);\n\t};\n\treturn Object.keys(iopts).map(function(k) {\n\t\tif (iopts[k] === '') { return k; }\n\t\treturn k + '=' + ppValue(iopts[k]);\n\t}).join(' ');\n};\n\n/**\n * @param {Object} stats\n * @param {Object} item\n * @param {Object} options\n * @param {string} mode\n * @param {string} title\n * @param {Object} actual\n * @param {Object} expected\n * @param {boolean} expectFail Whether this test was expected to fail (on blacklist).\n * @param {boolean} failureOnly Whether we should print only a failure message, or go on to print the diff.\n * @param {Object} bl BlackList.\n * @return {boolean} True if the failure was expected.\n */\nvar printFailure = function(stats, item, options, mode, title, actual, expected, expectFail, failureOnly, bl) {\n\tstats.failedTests++;\n\tstats.modes[mode].failedTests++;\n\tvar fail = {\n\t\ttitle: title,\n\t\traw: actual ? actual.raw : null,\n\t\texpected: expected ? expected.raw : null,\n\t\tactualNormalized: actual ? 
actual.normal : null,\n\t};\n\tstats.modes[mode].failList.push(fail);\n\n\tconst extTitle = `${title} (${mode})`.replace('\\n', ' ');\n\n\tvar blacklisted = false;\n\tif (ScriptUtils.booleanOption(options.blacklist) && expectFail) {\n\t\t// compare with remembered output\n\t\tvar normalizeAbout = s => s.replace(/(about=\\\\?[\"']#mwt)\\d+/g, '$1');\n\t\tif (normalizeAbout(bl[title][mode]) !== normalizeAbout(actual.raw)) {\n\t\t\tblacklisted = true;\n\t\t} else {\n\t\t\tif (!ScriptUtils.booleanOption(options.quiet)) {\n\t\t\t\tconsole.log('EXPECTED FAIL'.red + ': ' + extTitle.yellow);\n\t\t\t}\n\t\t\treturn true;\n\t\t}\n\t}\n\n\tstats.failedTestsUnexpected++;\n\tstats.modes[mode].failedTestsUnexpected++;\n\tfail.unexpected = true;\n\n\tif (!failureOnly) {\n\t\tconsole.log('=====================================================');\n\t}\n\n\tif (blacklisted) {\n\t\tconsole.log('UNEXPECTED BLACKLIST FAIL'.red.inverse + ': ' + extTitle.yellow);\n\t\tconsole.log('Blacklisted, but the output changed!'.red);\n\t} else {\n\t\tconsole.log('UNEXPECTED FAIL'.red.inverse + ': ' + extTitle.yellow);\n\t}\n\n\tif (mode === 'selser') {\n\t\tif (item.hasOwnProperty('wt2wtPassed') && item.wt2wtPassed) {\n\t\t\tconsole.log('Even worse, the non-selser wt2wt test passed!'.red);\n\t\t} else if (actual && item.hasOwnProperty('wt2wtResult') &&\n\t\t\t\titem.wt2wtResult !== actual.raw) {\n\t\t\tconsole.log('Even worse, the non-selser wt2wt test had a different result!'.red);\n\t\t}\n\t}\n\n\tif (!failureOnly) {\n\t\tconsole.log(item.comments.join('\\n'));\n\t\tif (options) {\n\t\t\tconsole.log('OPTIONS'.cyan + ':');\n\t\t\tconsole.log(prettyPrintIOptions(item.options) + '\\n');\n\t\t}\n\t\tconsole.log('INPUT'.cyan + ':');\n\t\tconsole.log(actual.input + '\\n');\n\t\tconsole.log(options.getActualExpected(actual, expected, options.getDiff));\n\t}\n\n\treturn false;\n};\n\n/**\n * @param {Object} stats\n * @param {Object} item\n * @param {Object} options\n * @param {string} mode\n * @param 
{string} title\n * @param {boolean} expectSuccess Whether this success was expected (or was this test blacklisted?).\n * @return {boolean} True if the success was expected.\n */\nvar printSuccess = function(stats, item, options, mode, title, expectSuccess) {\n\tvar quiet = ScriptUtils.booleanOption(options.quiet);\n\tstats.passedTests++;\n\tstats.modes[mode].passedTests++;\n\n\tconst extTitle = `${title} (${mode})`.replace('\\n', ' ');\n\n\tif (ScriptUtils.booleanOption(options.blacklist) && !expectSuccess) {\n\t\tstats.passedTestsUnexpected++;\n\t\tstats.modes[mode].passedTestsUnexpected++;\n\t\tconsole.log('UNEXPECTED PASS'.green.inverse +\n\t\t\t':' + extTitle.yellow);\n\t\treturn false;\n\t}\n\tif (!quiet) {\n\t\tvar outStr = 'EXPECTED PASS';\n\n\t\toutStr = outStr.green + ': ' + extTitle.yellow;\n\n\t\tconsole.log(outStr);\n\n\t\tif (mode === 'selser' && item.hasOwnProperty('wt2wtPassed') &&\n\t\t\t\t!item.wt2wtPassed) {\n\t\t\tconsole.log('Even better, the non-selser wt2wt test failed!'.red);\n\t\t}\n\t}\n\treturn true;\n};\n\n/**\n * Print the actual and expected outputs.\n *\n * @param {Object} actual\n * @param {string} actual.raw\n * @param {string} actual.normal\n * @param {Object} expected\n * @param {string} expected.raw\n * @param {string} expected.normal\n * @param {Function} getDiff Returns a string showing the diff(s) for the test.\n * @param {Object} getDiff.actual\n * @param {Object} getDiff.expected\n * @return {string}\n */\nvar getActualExpected = function(actual, expected, getDiff) {\n\tlet mkVisible =\n\t\ts => s.replace(/\\n/g, '\\u21b5\\n'.white).replace(/\\xA0/g, '\\u2423'.white);\n\tif (colors.mode === 'none') {\n\t\tmkVisible = s => s;\n\t}\n\tvar returnStr = '';\n\treturnStr += 'RAW EXPECTED'.cyan + ':\\n';\n\treturnStr += expected.raw + '\\n';\n\n\treturnStr += 'RAW RENDERED'.cyan + ':\\n';\n\treturnStr += actual.raw + '\\n';\n\n\treturnStr += 'NORMALIZED EXPECTED'.magenta + ':\\n';\n\treturnStr += mkVisible(expected.normal) + 
'\\n';\n\n\treturnStr += 'NORMALIZED RENDERED'.magenta + ':\\n';\n\treturnStr += mkVisible(actual.normal) + '\\n';\n\n\treturnStr += 'DIFF'.cyan + ':\\n';\n\treturnStr += getDiff(actual, expected);\n\n\treturn returnStr;\n};\n\n/**\n * @param {Object} actual\n * @param {string} actual.normal\n * @param {Object} expected\n * @param {string} expected.normal\n * @return {string} Colorized diff\n */\nvar doDiff = function(actual, expected) {\n\t// safe to always request color diff, because we set color mode='none'\n\t// if colors are turned off.\n\tvar e = expected.normal.replace(/\\xA0/g, '\\u2423');\n\tvar a = actual.normal.replace(/\\xA0/g, '\\u2423');\n\treturn Diff.colorDiff(e, a, {\n\t\tcontext: 2,\n\t\tnoColor: (colors.mode === 'none'),\n\t});\n};\n\n/**\n * @param {Function} reportFailure\n * @param {Function} reportSuccess\n * @param {Object} bl BlackList.\n * @param {Object} stats\n * @param {Object} item\n * @param {Object} options\n * @param {string} mode\n * @param {Object} expected\n * @param {Object} actual\n * @param {Function} pre\n * @param {Function} post\n * @return {boolean} True if the result was as expected.\n */\nfunction printResult(reportFailure, reportSuccess, bl, stats, item, options, mode, expected, actual, pre, post) {\n\tvar title = item.title;  // Title may be modified here, so pass it on.\n\n\tvar quick = ScriptUtils.booleanOption(options.quick);\n\n\tif (mode === 'selser') {\n\t\ttitle += ' ' + (item.changes ? 
JSON.stringify(item.changes) : '[manual]');\n\t} else if (mode === 'wt2html' && item.options.langconv) {\n\t\ttitle += ' [langconv]';\n\t}\n\n\tvar tb = bl[title];\n\tvar expectFail = (tb && tb.hasOwnProperty(mode));\n\tvar fail = (expected.normal !== actual.normal);\n\t// Return whether the test was as expected, independent of pass/fail\n\tvar asExpected;\n\n\tif (mode === 'wt2wt') {\n\t\titem.wt2wtPassed = !fail;\n\t\titem.wt2wtResult = actual.raw;\n\t}\n\n\t// don't report selser fails when nothing was changed or it's a dup\n\tif (\n\t\tmode === 'selser' && !JSUtils.deepEquals(item.changetree, ['manual']) &&\n\t\t(JSUtils.deepEquals(item.changes, []) || item.duplicateChange)\n\t) {\n\t\treturn true;\n\t}\n\n\tif (typeof pre === 'function') {\n\t\tpre(stats, mode, title, item.time);\n\t}\n\n\tif (fail) {\n\t\tasExpected = reportFailure(stats, item, options, mode, title, actual, expected, expectFail, quick, bl);\n\t} else {\n\t\tasExpected = reportSuccess(stats, item, options, mode, title, !expectFail);\n\t}\n\n\tif (typeof post === 'function') {\n\t\tpost(stats, mode);\n\t}\n\n\treturn asExpected;\n}\n\nvar _reportOnce = false;\n/**\n * Simple function for reporting the start of the tests.\n *\n * This method can be reimplemented in the options of the ParserTests object.\n */\nvar reportStartOfTests = function() {\n\tif (!_reportOnce) {\n\t\t_reportOnce = true;\n\t\tconsole.log('ParserTests running with node', process.version);\n\t\tconsole.log('Initialization complete. 
Now launching tests.');\n\t}\n};\n\n/**\n * Get the actual and expected outputs encoded for XML output.\n *\n * @inheritdoc getActualExpected\n *\n * @return {string} The XML representation of the actual and expected outputs.\n */\nvar getActualExpectedXML = function(actual, expected, getDiff) {\n\tvar returnStr = '';\n\n\treturnStr += 'RAW EXPECTED:\\n';\n\treturnStr += TestUtils.encodeXml(expected.raw) + '\\n\\n';\n\n\treturnStr += 'RAW RENDERED:\\n';\n\treturnStr += TestUtils.encodeXml(actual.raw) + '\\n\\n';\n\n\treturnStr += 'NORMALIZED EXPECTED:\\n';\n\treturnStr += TestUtils.encodeXml(expected.normal) + '\\n\\n';\n\n\treturnStr += 'NORMALIZED RENDERED:\\n';\n\treturnStr += TestUtils.encodeXml(actual.normal) + '\\n\\n';\n\n\treturnStr += 'DIFF:\\n';\n\treturnStr += TestUtils.encodeXml(getDiff(actual, expected, false));\n\n\treturn returnStr;\n};\n\n/**\n * Report the start of the tests output.\n *\n * @inheritdoc reportStart\n */\nvar reportStartXML = function() {};\n\n/**\n * Report the end of the tests output.\n *\n * @inheritdoc reportSummary\n */\nvar reportSummaryXML = function(modesRan, stats, file, loggedErrorCount, testFilter, blacklistChanged) {\n\tif (file === null) {\n\t\t/* Summary for all tests; not included in XML format output. 
*/\n\t\treturn;\n\t}\n\tconsole.log('<testsuites file=\"' + file + '\">');\n\tfor (var i = 0; i < modesRan.length; i++) {\n\t\tvar mode = modesRan[i];\n\t\tconsole.log('<testsuite name=\"parserTests-' + mode + '\">');\n\t\tconsole.log(stats.modes[mode].result);\n\t\tconsole.log('</testsuite>');\n\t}\n\tconsole.log('</testsuites>');\n};\n\n/**\n * Print a failure message for a test in XML.\n *\n * @inheritdoc printFailure\n */\nvar reportFailureXML = function(stats, item, options, mode, title, actual, expected, expectFail, failureOnly, bl) {\n\tstats.failedTests++;\n\tstats.modes[mode].failedTests++;\n\tvar failEle = '';\n\tvar blacklisted = false;\n\tif (ScriptUtils.booleanOption(options.blacklist) && expectFail) {\n\t\t// compare with remembered output\n\t\tblacklisted = (bl[title][mode] === actual.raw);\n\t}\n\tif (!blacklisted) {\n\t\tfailEle += '<failure type=\"parserTestsDifferenceInOutputFailure\">\\n';\n\t\tfailEle += getActualExpectedXML(actual, expected, options.getDiff);\n\t\tfailEle += '\\n</failure>';\n\t\tstats.failedTestsUnexpected++;\n\t\tstats.modes[mode].failedTestsUnexpected++;\n\t\tstats.modes[mode].result += failEle;\n\t}\n};\n\n/**\n * Print a success method for a test in XML.\n *\n * @inheritdoc printSuccess\n */\nvar reportSuccessXML = function(stats, item, options, mode, title, expectSuccess) {\n\tstats.passedTests++;\n\tstats.modes[mode].passedTests++;\n};\n\n/**\n * Print the result of a test in XML.\n *\n * @inheritdoc printResult\n */\nvar reportResultXML = function() {\n\tfunction pre(stats, mode, title, time) {\n\t\tvar testcaseEle;\n\t\ttestcaseEle = '<testcase name=\"' + TestUtils.encodeXml(title) + '\" ';\n\t\ttestcaseEle += 'assertions=\"1\" ';\n\n\t\tvar timeTotal;\n\t\tif (time && time.end && time.start) {\n\t\t\ttimeTotal = time.end - time.start;\n\t\t\tif (!isNaN(timeTotal)) {\n\t\t\t\ttestcaseEle += 'time=\"' + ((time.end - time.start) / 1000.0) + '\"';\n\t\t\t}\n\t\t}\n\n\t\ttestcaseEle += '>';\n\t\tstats.modes[mode].result 
+= testcaseEle;\n\t}\n\n\tfunction post(stats, mode) {\n\t\tstats.modes[mode].result += '</testcase>';\n\t}\n\n\tvar args = Array.from(arguments);\n\targs = [ reportFailureXML, reportSuccessXML ].concat(args, pre, post);\n\tprintResult.apply(this, args);\n\n\t// In xml, test all cases always\n\treturn true;\n};\n\n/**\n * Get the options from the command line.\n *\n * @return {Object}\n */\nvar getOpts = function() {\n\tvar standardOpts = ScriptUtils.addStandardOptions({\n\t\t'wt2html': {\n\t\t\tdescription: 'Wikitext -> HTML(DOM)',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'html2wt': {\n\t\t\tdescription: 'HTML(DOM) -> Wikitext',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'wt2wt': {\n\t\t\tdescription: 'Roundtrip testing: Wikitext -> DOM(HTML) -> Wikitext',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'html2html': {\n\t\t\tdescription: 'Roundtrip testing: HTML(DOM) -> Wikitext -> HTML(DOM)',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'selser': {\n\t\t\tdescription: 'Roundtrip testing: Wikitext -> DOM(HTML) -> Wikitext (with selective serialization). 
' +\n\t\t\t\t'Set to \"noauto\" to just run the tests with manual selser changes.',\n\t\t\t'boolean': false,\n\t\t},\n\t\t'changetree': {\n\t\t\tdescription: 'Changes to apply to parsed HTML to generate new HTML to be serialized (useful with selser)',\n\t\t\t'default': null,\n\t\t\t'boolean': false,\n\t\t},\n\t\t'numchanges': {\n\t\t\tdescription: 'Make multiple different changes to the DOM, run a selser test for each one.',\n\t\t\t'default': 20,\n\t\t\t'boolean': false,\n\t\t},\n\t\t'cache': {\n\t\t\tdescription: 'Get tests cases from cache file',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t\t'filter': {\n\t\t\tdescription: 'Only run tests whose descriptions match given string',\n\t\t},\n\t\t'regex': {\n\t\t\tdescription: 'Only run tests whose descriptions match given regex',\n\t\t\talias: ['regexp', 're'],\n\t\t},\n\t\t'run-disabled': {\n\t\t\tdescription: 'Run disabled tests',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'run-php': {\n\t\t\tdescription: 'Run php-only tests',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'maxtests': {\n\t\t\tdescription: 'Maximum number of tests to run',\n\t\t\t'boolean': false,\n\t\t},\n\t\t'quick': {\n\t\t\tdescription: 'Suppress diff output of failed tests',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t\t'quiet': {\n\t\t\tdescription: 'Suppress notification of passed tests (shows only failed tests)',\n\t\t\t'boolean': true,\n\t\t\t'default': false,\n\t\t},\n\t\t'blacklist': {\n\t\t\tdescription: 'Compare against expected failures from blacklist',\n\t\t\t'default': true,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'rewrite-blacklist': {\n\t\t\tdescription: 'Update parserTests-blacklist.json with failing tests.',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'exit-zero': {\n\t\t\tdescription: \"Don't exit with nonzero status if failures are found.\",\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\txml: {\n\t\t\tdescription: 'Print output in 
JUnit XML format.',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'exit-unexpected': {\n\t\t\tdescription: 'Exit after the first unexpected result.',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t\t'update-tests': {\n\t\t\tdescription: 'Update parserTests.txt with results from wt2html fails.',\n\t\t},\n\t\t'update-unexpected': {\n\t\t\tdescription: 'Update parserTests.txt with results from wt2html unexpected fails.',\n\t\t\t'default': false,\n\t\t\t'boolean': true,\n\t\t},\n\t}, {\n\t\t// override defaults for standard options\n\t\tusePHPPreProcessor: false,\n\t\tfetchConfig: false,\n\t});\n\n\treturn yargs\n\t.usage('Usage: $0 [options] [tests-file]')\n\t.options(standardOpts)\n\t.check(function(argv, aliases) {\n\t\tif (argv.filter === true) {\n\t\t\tthrow \"--filter needs an argument\";\n\t\t}\n\t\tif (argv.regex === true) {\n\t\t\tthrow \"--regex needs an argument\";\n\t\t}\n\t\treturn true;\n\t})\n\t.strict();\n};\n\nTestUtils.prepareOptions = function() {\n\tvar popts = getOpts();\n\tvar options = popts.argv;\n\n\tif (options.help) {\n\t\tpopts.showHelp();\n\t\tconsole.log(\"Additional dump options specific to parserTests script:\");\n\t\tconsole.log(\"* dom:post-changes  : Dumps DOM after applying selser changetree\\n\");\n\t\tconsole.log(\"Examples\");\n\t\tconsole.log(\"$ node parserTests --selser --filter '...' --dump dom:post-changes\");\n\t\tconsole.log(\"$ node parserTests --selser --filter '...' --changetree '...' 
--dump dom:post-changes\\n\");\n\t\tprocess.exit(0);\n\t}\n\n\tScriptUtils.setColorFlags(options);\n\n\tif (!(options.wt2wt || options.wt2html || options.html2wt || options.html2html || options.selser)) {\n\t\toptions.wt2wt = true;\n\t\toptions.wt2html = true;\n\t\toptions.html2html = true;\n\t\toptions.html2wt = true;\n\t\tif (ScriptUtils.booleanOption(options['rewrite-blacklist'])) {\n\t\t\t// turn on all modes by default for --rewrite-blacklist\n\t\t\toptions.selser = true;\n\t\t\t// sanity checking (T53448 asks to be able to use --filter here)\n\t\t\tif (options.filter || options.regex || options.maxtests || options['exit-unexpected']) {\n\t\t\t\tconsole.log(\"\\nERROR> can't combine --rewrite-blacklist with --filter, --maxtests or --exit-unexpected\");\n\t\t\t\tprocess.exit(1);\n\t\t\t}\n\t\t}\n\t}\n\n\tif (options.xml) {\n\t\toptions.reportResult = reportResultXML;\n\t\toptions.reportStart = reportStartXML;\n\t\toptions.reportSummary = reportSummaryXML;\n\t\toptions.reportFailure = reportFailureXML;\n\t\tcolors.mode = 'none';\n\t}\n\n\tif (typeof options.reportFailure !== 'function') {\n\t\t// default failure reporting is standard out,\n\t\t// see printFailure for documentation of the default.\n\t\toptions.reportFailure = printFailure;\n\t}\n\n\tif (typeof options.reportSuccess !== 'function') {\n\t\t// default success reporting is standard out,\n\t\t// see printSuccess for documentation of the default.\n\t\toptions.reportSuccess = printSuccess;\n\t}\n\n\tif (typeof options.reportStart !== 'function') {\n\t\t// default summary reporting is standard out,\n\t\t// see reportStart for documentation of the default.\n\t\toptions.reportStart = reportStartOfTests;\n\t}\n\n\tif (typeof options.reportSummary !== 'function') {\n\t\t// default summary reporting is standard out,\n\t\t// see reportSummary for documentation of the default.\n\t\toptions.reportSummary = reportSummary;\n\t}\n\n\tif (typeof options.reportResult !== 'function') {\n\t\t// default result reporting 
is standard out,\n\t\t// see printResult for documentation of the default.\n\t\toptions.reportResult = (...args) => printResult(options.reportFailure, options.reportSuccess, ...args);\n\t}\n\n\tif (typeof options.getDiff !== 'function') {\n\t\t// this is the default for diff-getting, but it can be overridden\n\t\t// see doDiff for documentation of the default.\n\t\toptions.getDiff = doDiff;\n\t}\n\n\tif (typeof options.getActualExpected !== 'function') {\n\t\t// this is the default for getting the actual and expected\n\t\t// outputs, but it can be overridden\n\t\t// see getActualExpected for documentation of the default.\n\t\toptions.getActualExpected = getActualExpected;\n\t}\n\n\toptions.modes = [];\n\n\tif (options.wt2html) {\n\t\toptions.modes.push('wt2html');\n\t}\n\tif (options.wt2wt) {\n\t\toptions.modes.push('wt2wt');\n\t}\n\tif (options.html2html) {\n\t\toptions.modes.push('html2html');\n\t}\n\tif (options.html2wt) {\n\t\toptions.modes.push('html2wt');\n\t}\n\tif (options.selser) {\n\t\toptions.modes.push('selser');\n\t}\n\n\treturn options;\n};\n\n// Hard-code some interwiki prefixes, as is done\n// in parserTest.inc:setupInterwikis()\nTestUtils.iwl = {\n\tlocal: {\n\t\turl: 'http://doesnt.matter.org/$1',\n\t\tlocalinterwiki: '',\n\t},\n\twikipedia: {\n\t\turl: 'http://en.wikipedia.org/wiki/$1',\n\t},\n\tmeatball: {\n\t\t// this has been updated in the live wikis, but the parser tests\n\t\t// expect the old value (as set in parserTest.inc:setupInterwikis())\n\t\turl: 'http://www.usemod.com/cgi-bin/mb.pl?$1',\n\t},\n\tmemoryalpha: {\n\t\turl: 'http://www.memory-alpha.org/en/index.php/$1',\n\t},\n\tzh: {\n\t\turl: 'http://zh.wikipedia.org/wiki/$1',\n\t\tlanguage: '\\u4e2d\\u6587',\n\t\tlocal: '',\n\t},\n\tes: {\n\t\turl: 'http://es.wikipedia.org/wiki/$1',\n\t\tlanguage: 'espa\\u00f1ol',\n\t\tlocal: '',\n\t},\n\tfr: {\n\t\turl: 'http://fr.wikipedia.org/wiki/$1',\n\t\tlanguage: 'fran\\u00e7ais',\n\t\tlocal: '',\n\t},\n\tru: {\n\t\turl: 
'http://ru.wikipedia.org/wiki/$1',\n\t\tlanguage: '\\u0440\\u0443\\u0441\\u0441\\u043a\\u0438\\u0439',\n\t\tlocal: '',\n\t},\n\tmi: {\n\t\turl: 'http://mi.wikipedia.org/wiki/$1',\n\t\t// better for testing if one of the\n\t\t// localinterwiki prefixes is also a\n\t\t// language\n\t\tlanguage: 'Test',\n\t\tlocal: '',\n\t\tlocalinterwiki: '',\n\t},\n\tmul: {\n\t\turl: 'http://wikisource.org/wiki/$1',\n\t\textralanglink: '',\n\t\tlinktext: 'Multilingual',\n\t\tsitename: 'WikiSource',\n\t\tlocal: '',\n\t},\n\t// not in PHP setupInterwikis(), but needed\n\ten: {\n\t\turl: 'http://en.wikipedia.org/wiki/$1',\n\t\tlanguage: 'English',\n\t\tlocal: '',\n\t\tprotorel: '',\n\t},\n\tstats: {\n\t\tlocal: '',\n\t\turl: 'https://stats.wikimedia.org/$1'\n\t},\n\tgerrit: {\n\t\tlocal: '',\n\t\turl: 'https://gerrit.wikimedia.org/$1'\n\t}\n};\n\nTestUtils.addNamespace = function(wikiConf, name) {\n\tvar nsid = name.id;\n\tvar old = wikiConf.siteInfo.namespaces[nsid];\n\tif (old) {  // Id may already be defined; if so, clear it.\n\t\tif (old === name) { return; }  // ParserTests does a lot redundantly.\n\t\twikiConf.namespaceIds.delete(Util.normalizeNamespaceName(old['*']));\n\t\twikiConf.canonicalNamespaces[Util.normalizeNamespaceName(old.canonical ? old.canonical : old['*'])] = undefined;\n\t}\n\twikiConf.namespaceNames[nsid] = name['*'];\n\twikiConf.namespaceIds.set(Util.normalizeNamespaceName(name['*']), Number(nsid));\n\twikiConf.canonicalNamespaces[Util.normalizeNamespaceName(name.canonical ? 
name.canonical : name['*'])] = Number(nsid);\n\twikiConf.namespacesWithSubpages[nsid] = true;\n\twikiConf.siteInfo.namespaces[nsid] = name;\n};\n\nif (typeof module === \"object\") {\n\tmodule.exports.TestUtils = TestUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/api-testing/Parsoid.js","messages":[{"ruleId":"node/no-deprecated-api","severity":2,"message":"'url.parse' was deprecated since v11.0.0. Use 'url.URL' constructor instead.","line":48,"column":20,"nodeType":"MemberExpression","endLine":48,"endColumn":29}],"errorCount":1,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/** Cases for testing the Parsoid API through HTTP */\n/* global describe, it */\n\n'use strict';\n\nconst { REST } = require( 'api-testing' );\n\nvar domino = require('domino');\nvar should = require('chai').should();\nvar semver = require('semver');\nvar url = require('url');\n\nvar Util = require('../../lib/utils/Util.js').Util;\n\nconst parsoidOptions = {\n\tlimits: {\n\t\twt2html: { maxWikitextSize: 20000 },\n\t\thtml2wt: { maxHTMLSize: 10000 },\n\t},\n};\n\nvar defaultContentVersion = '2.2.0';\n\n// section wrappers are a distraction from the main business of\n// this file which is to verify functionality of API end points\n// independent of what they are returning and computing.\n//\n// Verifying the correctness of content is actually the job of\n// parser tests and other tests.\n//\n// So, hide most of that that distraction in a helper.\n//\n// Right now, all uses of this helper have empty lead sections.\n// But, maybe in the future, this may change. 
So, retain the option.\nfunction validateDoc(doc, nodeName, emptyLead) {\n\tvar leadSection = doc.body.firstChild;\n\tleadSection.nodeName.should.equal('SECTION');\n\tif (emptyLead) {\n\t\t// Could have whitespace and comments\n\t\tleadSection.childElementCount.should.equal(0);\n\t}\n\tvar nonEmptySection = emptyLead ? leadSection.nextSibling : leadSection;\n\tnonEmptySection.firstChild.nodeName.should.equal(nodeName);\n}\n\ndescribe('Parsoid API', function() {\n\tconst client = new REST();\n\tconst parsedUrl = url.parse(client.req.app);\n\tconst PARSOID_URL = parsedUrl.href;\n\tconst hostname = parsedUrl.hostname;\n\tconst mockDomain = client.pathPrefix = `rest.php/${hostname}`;\n\n\tdescribe('formats', function() {\n\n\t\tit('should accept application/x-www-form-urlencoded', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.type('form')\n\t\t\t.send({\n\t\t\t\twikitext: '== h2 ==',\n\t\t\t})\n\t\t\t.expect(200)\n\t\t\t.expect(function(res) {\n\t\t\t\tvalidateDoc(domino.createDocument(res.text), 'H2', true);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept application/json', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.type('json')\n\t\t\t.send({\n\t\t\t\twikitext: '== h2 ==',\n\t\t\t})\n\t\t\t.expect(200)\n\t\t\t.expect(function(res) {\n\t\t\t\tvalidateDoc(domino.createDocument(res.text), 'H2', true);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept multipart/form-data', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.field('wikitext', '== h2 ==')\n\t\t\t.expect(200)\n\t\t\t.expect(function(res) {\n\t\t\t\tvalidateDoc(domino.createDocument(res.text), 'H2', true);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\t// Skipped because all errors are returned as JSON in Parsoid/PHP\n\t\tit.skip('should return a plaintext error', function(done) 
{\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/wikitext/Doesnotexist')\n\t\t\t.expect(404)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers['content-type'].should.equal(\n\t\t\t\t\t'text/plain; charset=utf-8'\n\t\t\t\t);\n\t\t\t\tres.text.should.equal('Did not find page revisions for Doesnotexist');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should return a json error', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Doesnotexist')\n\t\t\t.expect(404)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers['content-type'].should.equal(\n\t\t\t\t\t'application/json'\n\t\t\t\t);\n\t\t\t\tres.body.message.should.equal('The specified revision does not exist.');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\t// Skipped because all errors are returned as JSON in Parsoid/PHP\n\t\tit.skip('should return an html error', function(done) {\n\t\t\tclient.req\n\t\t\t.get('<img src=x onerror=\"javascript:alert(\\'hi\\')\">/v3/page/html/XSS')\n\t\t\t.expect(404)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers['content-type'].should.equal(\n\t\t\t\t\t'text/html; charset=utf-8'\n\t\t\t\t);\n\t\t\t\tres.text.should.equal('Invalid domain: &lt;img src=x onerror=&quot;javascript:alert(&apos;hi&apos;)&quot;&gt;');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t});  // formats\n\n\tvar acceptableHtmlResponse = function(contentVersion, expectFunc) {\n\t\treturn function(res) {\n\t\t\tres.statusCode.should.equal(200);\n\t\t\tres.headers.should.have.property('content-type');\n\t\t\tres.headers['content-type'].should.equal(\n\t\t\t\t'text/html; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + contentVersion + '\"'\n\t\t\t);\n\t\t\tres.text.should.not.equal('');\n\t\t\tif (expectFunc) {\n\t\t\t\treturn expectFunc(res.text);\n\t\t\t}\n\t\t};\n\t};\n\n\tvar acceptablePageBundleResponse = function(contentVersion, expectFunc) {\n\t\treturn function(res) 
{\n\t\t\tres.statusCode.should.equal(200);\n\t\t\tres.headers.should.have.property('content-type');\n\t\t\tres.headers['content-type'].should.equal(\n\t\t\t\t'application/json; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/' + contentVersion + '\"'\n\t\t\t);\n\t\t\tres.body.should.have.property('html');\n\t\t\tres.body.html.should.have.property('headers');\n\t\t\tres.body.html.headers.should.have.property('content-type');\n\t\t\tres.body.html.headers['content-type'].should.equal(\n\t\t\t\t'text/html; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + contentVersion + '\"'\n\t\t\t);\n\t\t\tres.body.html.should.have.property('body');\n\t\t\tres.body.should.have.property('data-parsoid');\n\t\t\tres.body['data-parsoid'].should.have.property('headers');\n\t\t\tres.body['data-parsoid'].headers.should.have.property('content-type');\n\t\t\tres.body['data-parsoid'].headers['content-type'].should.equal(\n\t\t\t\t'application/json; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/' + contentVersion + '\"'\n\t\t\t);\n\t\t\tres.body['data-parsoid'].should.have.property('body');\n\t\t\tif (semver.gte(contentVersion, '999.0.0')) {\n\t\t\t\tres.body.should.have.property('data-mw');\n\t\t\t\tres.body['data-mw'].should.have.property('headers');\n\t\t\t\tres.body['data-mw'].headers.should.have.property('content-type');\n\t\t\t\tres.body['data-mw'].headers['content-type'].should.equal(\n\t\t\t\t\t'application/json; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/data-mw/' + contentVersion + '\"'\n\t\t\t\t);\n\t\t\t\tres.body['data-mw'].should.have.property('body');\n\t\t\t}\n\t\t\tif (expectFunc) {\n\t\t\t\treturn expectFunc(res.body.html.body);\n\t\t\t}\n\t\t};\n\t};\n\n\tdescribe('accepts', function() {\n\n\t\tit('should not accept requests for older content versions (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept', 
'text/html; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/0.0.0\"')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(406)\n\t\t\t.expect(function(res) {\n\t\t\t\t// FIXME: See skipped html error test above\n\t\t\t\tJSON.parse(res.error.text).message.should.equal(\n\t\t\t\t\t'Not acceptable'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not accept requests for older content versions (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept', 'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/0.0.0\"')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(406)\n\t\t\t.expect(function(res) {\n\t\t\t\tJSON.parse(res.error.text).message.should.equal(\n\t\t\t\t\t'Not acceptable'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not accept requests for other profiles (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept', 'text/html; profile=\"something different\"')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(406)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not accept requests for other profiles (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept', 'application/json; profile=\"something different\"')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(406)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept wildcards (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept', '*/*')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptableHtmlResponse(defaultContentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept wildcards (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
'/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept', '*/*')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(defaultContentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should prefer higher quality (html)', function(done) {\n\t\t\tvar contentVersion = '999.0.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept',\n\t\t\t\t'text/html; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/2.2.0\"; q=0.5,' +\n\t\t\t\t'text/html; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"; q=0.8')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptableHtmlResponse(contentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should prefer higher quality (pagebundle)', function(done) {\n\t\t\tvar contentVersion = '999.0.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept',\n\t\t\t\t'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/2.2.0\"; q=0.5,' +\n\t\t\t\t'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/999.0.0\"; q=0.8')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(contentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept requests for the latest content version (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({ wikitext: '== h2 ==' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptableHtmlResponse(defaultContentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept requests for the latest content version (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({ wikitext: '== h2 ==' 
})\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(defaultContentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept requests for content version 2.x (html)', function(done) {\n\t\t\tvar contentVersion = '2.2.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept', 'text/html; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + contentVersion + '\"')\n\t\t\t.send({ wikitext: '{{1x|hi}}' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptableHtmlResponse(contentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept requests for content version 2.x (pagebundle)', function(done) {\n\t\t\tvar contentVersion = '2.2.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept', 'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/' + contentVersion + '\"')\n\t\t\t.send({ wikitext: '{{1x|hi}}' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(contentVersion, function(html) {\n\t\t\t\t// In < 999.x, data-mw is still inline.\n\t\t\t\thtml.should.match(/\\s+data-mw\\s*=\\s*['\"]/);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\t// Note that these tests aren't that useful directly after a major version bump\n\n\t\tit('should accept requests for older content version 2.x (html)', function(done) {\n\t\t\tvar contentVersion = '2.2.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept', 'text/html; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/2.0.0\"')  // Keep this on the older version\n\t\t\t.send({ wikitext: '{{1x|hi}}' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptableHtmlResponse(contentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept requests for older content version 2.x (pagebundle)', function(done) {\n\t\t\tvar contentVersion = '2.2.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
'/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept', 'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/2.0.0\"')  // Keep this on the older version\n\t\t\t.send({ wikitext: '{{1x|hi}}' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(contentVersion, function(html) {\n\t\t\t\t// In < 999.x, data-mw is still inline.\n\t\t\t\thtml.should.match(/\\s+data-mw\\s*=\\s*['\"]/);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should sanity check 2.x content (pagebundle)', function(done) {\n\t\t\tvar contentVersion = '2.2.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept', 'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/' + contentVersion + '\"')\n\t\t\t.send({ wikitext: '[[File:Audio.oga]]' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(contentVersion, function(html) {\n\t\t\t\tvar doc = domino.createDocument(html);\n\t\t\t\tdoc.querySelectorAll('audio').length.should.equal(1);\n\t\t\t\tdoc.querySelectorAll('video').length.should.equal(0);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept requests for content version 999.x (html)', function(done) {\n\t\t\tvar contentVersion = '999.0.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept', 'text/html; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + contentVersion + '\"')\n\t\t\t.send({ wikitext: '{{1x|hi}}' })\n\t\t\t.expect(200)\n\t\t\t.expect(acceptableHtmlResponse(contentVersion))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept requests for content version 999.x (pagebundle)', function(done) {\n\t\t\tvar contentVersion = '999.0.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept', 'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/' + contentVersion + '\"')\n\t\t\t.send({ wikitext: '{{1x|hi}}' 
})\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(contentVersion, function(html) {\n\t\t\t\t// In 999.x, data-mw is in the pagebundle.\n\t\t\t\thtml.should.not.match(/\\s+data-mw\\s*=\\s*['\"]/);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t});  // accepts\n\n\tvar validWikitextResponse = function(expected) {\n\t\treturn function(res) {\n\t\t\tres.statusCode.should.equal(200);\n\t\t\tres.headers.should.have.property('content-type');\n\t\t\tres.headers['content-type'].should.equal(\n\t\t\t\t// note that express does some reordering\n\t\t\t\t'text/plain; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/wikitext/1.0.0\"'\n\t\t\t);\n\t\t\tif (expected !== undefined) {\n\t\t\t\tres.text.should.equal(expected);\n\t\t\t} else {\n\t\t\t\tres.text.should.not.equal('');\n\t\t\t}\n\t\t};\n\t};\n\n\tvar validHtmlResponse = function(expectFunc) {\n\t\treturn function(res) {\n\t\t\tres.statusCode.should.equal(200);\n\t\t\tres.headers.should.have.property('content-type');\n\t\t\tres.headers['content-type'].should.equal(\n\t\t\t\t'text/html; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"'\n\t\t\t);\n\t\t\tvar doc = domino.createDocument(res.text);\n\t\t\tif (expectFunc) {\n\t\t\t\treturn expectFunc(doc);\n\t\t\t} else {\n\t\t\t\tres.text.should.not.equal('');\n\t\t\t}\n\t\t};\n\t};\n\n\tvar validPageBundleResponse = function(expectFunc) {\n\t\treturn function(res) {\n\t\t\tres.statusCode.should.equal(200);\n\t\t\tres.body.should.have.property('html');\n\t\t\tres.body.html.should.have.property('headers');\n\t\t\tres.body.html.headers.should.have.property('content-type');\n\t\t\tres.body.html.headers['content-type'].should.equal(\n\t\t\t\t'text/html; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + 
'\"'\n\t\t\t);\n\t\t\tres.body.html.should.have.property('body');\n\t\t\tres.body.should.have.property('data-parsoid');\n\t\t\tres.body['data-parsoid'].should.have.property('headers');\n\t\t\tres.body['data-parsoid'].headers.should.have.property('content-type');\n\t\t\tres.body['data-parsoid'].headers['content-type'].should.equal(\n\t\t\t\t'application/json; charset=utf-8; profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/' + defaultContentVersion + '\"'\n\t\t\t);\n\t\t\tres.body['data-parsoid'].should.have.property('body');\n\t\t\t// TODO: Check data-mw when 999.x is the default.\n\t\t\tconsole.assert(!semver.gte(defaultContentVersion, '999.0.0'));\n\t\t\tvar doc = domino.createDocument(res.body.html.body);\n\t\t\tif (expectFunc) {\n\t\t\t\treturn expectFunc(doc, res.body['data-parsoid'].body);\n\t\t\t}\n\t\t};\n\t};\n\n\tdescribe('wt2lint', function() {\n\n\t\tit('should lint the given page', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/lint/Lint_Page/102')\n\t\t\t.expect(200)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.body.should.be.instanceof(Array);\n\t\t\t\tres.body.length.should.equal(1);\n\t\t\t\tres.body[0].type.should.equal('fostered');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should lint the given wikitext', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/lint/')\n\t\t\t.send({\n\t\t\t\twikitext: {\n\t\t\t\t\theaders: {\n\t\t\t\t\t\t'content-type': 'text/plain;profile=\"https://www.mediawiki.org/wiki/Specs/wikitext/1.0.0\"',\n\t\t\t\t\t},\n\t\t\t\t\tbody: \"{|\\nhi\\n|ho\\n|}\",\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(200)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.body.should.be.instanceof(Array);\n\t\t\t\tres.body.length.should.equal(1);\n\t\t\t\tres.body[0].type.should.equal('fostered');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should lint the given page', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
'/v3/transform/wikitext/to/lint/Lint_Page/102')\n\t\t\t.send({})\n\t\t\t.expect(200)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.body.should.be.instanceof(Array);\n\t\t\t\tres.body.length.should.equal(1);\n\t\t\t\tres.body[0].type.should.equal('fostered');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should redirect title to latest revision (lint)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/lint/')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Lint_Page',\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(307)  // no revid or wikitext source provided\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain +\n\t\t\t\t\t'/v3/transform/wikitext/to/lint/Lint%20Page/102'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t});\n\n\tdescribe(\"wt2html\", function() {\n\n\t\tit('should redirect title to latest revision (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Main_Page')\n\t\t\t.expect(302)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain + '/v3/page/html/Main%20Page/1'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should redirect title to latest revision (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Main_Page')\n\t\t\t.expect(302)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain + '/v3/page/pagebundle/Main%20Page/1'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should redirect title to latest revision (wikitext)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/wikitext/Main_Page')\n\t\t\t.expect(302)\n\t\t\t.expect(function(res) 
{\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain + '/v3/page/wikitext/Main%20Page/1'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit(\"should preserve querystring params while redirecting\", function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Main_Page?test=123')\n\t\t\t.expect(302)\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain + '/v3/page/html/Main%20Page/1?test=123'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should get from a title and revision (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Main_Page/1')\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\t// SECTION -> P\n\t\t\t\tdoc.body.firstChild.firstChild.textContent.should.equal('MediaWiki has been successfully installed.');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\t// Parsoid/PHP isn't really expected to work on old MediaWiki versions\n\t\tit.skip('should get from a title and revision (html, pre-mcr)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Old_Response/999')\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\t// SECTION -> P\n\t\t\t\tdoc.body.firstChild.firstChild.textContent.should.equal('MediaWiki was successfully installed.');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should get from a title and revision (html, json content)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/JSON_Page/101')\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tdoc.body.firstChild.nodeName.should.equal('TABLE');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should get from a title and revision (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + 
'/v3/page/pagebundle/Main_Page/1')\n\t\t\t.expect(validPageBundleResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should get from a title and revision (pagebundle, json content)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/JSON_Page/101')\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.firstChild.nodeName.should.equal('TABLE');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should get from a title and revision (wikitext)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/wikitext/Main_Page/1')\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should set a custom etag for get requests (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Main_Page/1')\n\t\t\t.expect(validHtmlResponse())\n\t\t\t.expect((res) => {\n\t\t\t\tres.headers.should.have.property('etag');\n\t\t\t\tres.headers.etag.should.match(/^W\\/\"1\\//);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should set a custom etag for get requests (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Main_Page/1')\n\t\t\t.expect(validPageBundleResponse())\n\t\t\t.expect((res) => {\n\t\t\t\tres.headers.should.have.property('etag');\n\t\t\t\tres.headers.etag.should.match(/^W\\/\"1\\//);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept wikitext as a string for html', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\twikitext: \"== h2 ==\",\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'H2', true);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept json contentmodel as a string for html', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\twikitext: '{\"1\":2}',\n\t\t\t\tcontentmodel: 
'json',\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tdoc.body.firstChild.nodeName.should.equal('TABLE');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept wikitext as a string for pagebundle', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\twikitext: \"== h2 ==\",\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'H2', true);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept json contentmodel as a string for pagebundle', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\twikitext: '{\"1\":2}',\n\t\t\t\tcontentmodel: 'json',\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.firstChild.nodeName.should.equal('TABLE');\n\t\t\t\tshould.not.exist(doc.querySelector('*[typeof=\"mw:Error\"]'));\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept wikitext with headers', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\twikitext: {\n\t\t\t\t\theaders: {\n\t\t\t\t\t\t'content-type': 'text/plain;profile=\"https://www.mediawiki.org/wiki/Specs/wikitext/1.0.0\"',\n\t\t\t\t\t},\n\t\t\t\t\tbody: \"== h2 ==\",\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'H2', true);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should require a title when no wikitext is provided (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should require a title when no wikitext is provided (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
'/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should error when revision not found (page, html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Doesnotexist')\n\t\t\t.expect(404)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should error when revision not found (page, pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Doesnotexist')\n\t\t\t.expect(404)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should error when revision not found (transform, wt2html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/Doesnotexist')\n\t\t\t.send({})\n\t\t\t.expect(404)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should error when revision not found (transform, wt2pb)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/Doesnotexist')\n\t\t\t.send({})\n\t\t\t.expect(404)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept an original title (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Main_Page',\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(307)  // no revid or wikitext source provided\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain + '/v3/transform/wikitext/to/html/Main%20Page/1'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept an original title (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Main_Page',\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(307)  // no revid or wikitext source provided\n\t\t\t.expect(function(res) 
{\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain + '/v3/transform/wikitext/to/pagebundle/Main%20Page/1'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept an original title, other than main', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Lint Page',\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(307)  // no revid or wikitext source provided\n\t\t\t.expect(function(res) {\n\t\t\t\tres.headers.should.have.property('location');\n\t\t\t\tres.headers.location.should.equal(\n\t\t\t\t\tPARSOID_URL + mockDomain + '/v3/transform/wikitext/to/html/Lint%20Page/102'\n\t\t\t\t);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not require a title when empty wikitext is provided (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\twikitext: '',\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tdoc.body.children.length.should.equal(1); // empty lead section\n\t\t\t\tdoc.body.firstChild.nodeName.should.equal('SECTION');\n\t\t\t\tdoc.body.firstChild.children.length.should.equal(0);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not require a title when empty wikitext is provided (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\twikitext: '',\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not require a title when wikitext is provided', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\twikitext: \"== h2 ==\",\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'H2', true);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not 
require a rev id when wikitext and a title is provided', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/Main_Page')\n\t\t\t.send({\n\t\t\t\twikitext: \"== h2 ==\",\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'H2', true);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept the wikitext source as original data', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/Main_Page/1')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\twikitext: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/plain;profile=\"https://www.mediawiki.org/wiki/Specs/wikitext/1.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: \"== h2 ==\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'H2', true);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should use the proper source text', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/Main_Page/1')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\twikitext: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/plain;profile=\"https://www.mediawiki.org/wiki/Specs/wikitext/1.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: \"{{1x|foo|bar=bat}}\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'P', false);\n\t\t\t\tvar p = doc.querySelector('P[typeof=\"mw:Transclusion\"]');\n\t\t\t\tvar dmw = JSON.parse(p.getAttribute('data-mw'));\n\t\t\t\tvar template = dmw.parts[0].template;\n\t\t\t\ttemplate.target.wt.should.equal('1x');\n\t\t\t\ttemplate.params[1].wt.should.equal('foo');\n\t\t\t\ttemplate.params.bar.wt.should.equal('bat');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept the wikitext source as original without a title or revision', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
'/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\twikitext: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/plain;profile=\"https://www.mediawiki.org/wiki/Specs/wikitext/1.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: \"== h2 ==\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvalidateDoc(doc, 'H2', true);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit(\"should respect body parameter in wikitext->html (body_only)\", function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\twikitext: \"''foo''\",\n\t\t\t\tbody_only: 1,\n\t\t\t})\n\t\t\t.expect(validHtmlResponse())\n\t\t\t.expect(function(res) {\n\t\t\t\t// v3 only returns children of <body>\n\t\t\t\tres.text.should.not.match(/<body/);\n\t\t\t\tres.text.should.match(/<p/);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit(\"should respect body parameter in wikitext->pagebundle requests (body_only)\", function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\twikitext: \"''foo''\",\n\t\t\t\tbody_only: 1,\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse())\n\t\t\t.expect(function(res) {\n\t\t\t\t// v3 only returns children of <body>\n\t\t\t\tres.body.html.body.should.not.match(/<body/);\n\t\t\t\tres.body.html.body.should.match(/<p/);\n\t\t\t\t// No section wrapping in body-only mode\n\t\t\t\tres.body.html.body.should.not.match(/<section/);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not include captured offsets', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Main_Page/1')\n\t\t\t.expect(validPageBundleResponse(function(doc, dp) {\n\t\t\t\tdp.should.not.have.property('sectionOffsets');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit(\"should implement subst - simple\", function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
'/v3/transform/wikitext/to/html/')\n\t\t\t.send({ wikitext: \"{{1x|foo}}\", subst: 'true' })\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvar body = doc.body;\n\t\t\t\t// <body> should have one child, <section>, the lead section\n\t\t\t\tbody.childElementCount.should.equal(1);\n\t\t\t\tvar p = body.firstChild.firstChild;\n\t\t\t\tp.nodeName.should.equal('P');\n\t\t\t\tp.innerHTML.should.equal('foo');\n\t\t\t\t// The <p> shouldn't be a template expansion, just a plain ol' one\n\t\t\t\tp.hasAttribute('typeof').should.equal(false);\n\t\t\t\t// and it shouldn't have any data-parsoid in it\n\t\t\t\tp.hasAttribute('data-parsoid').should.equal(false);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit(\"should implement subst - internal transclusion\", function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({ wikitext: \"{{1x|foo {{1x|bar}} baz}}\", subst: 'true' })\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tvar body = doc.body;\n\t\t\t\t// <body> should have one child, <section>, the lead section\n\t\t\t\tbody.childElementCount.should.equal(1);\n\t\t\t\tvar p = body.firstChild.firstChild;\n\t\t\t\tp.nodeName.should.equal('P');\n\t\t\t\t// The <p> shouldn't be a template expansion, just a plain ol' one\n\t\t\t\tp.hasAttribute('typeof').should.equal(false);\n\t\t\t\t// and it shouldn't have any data-parsoid in it\n\t\t\t\tp.hasAttribute('data-parsoid').should.equal(false);\n\t\t\t\t// The internal transclusion should be presented as such\n\t\t\t\tvar tplp = p.firstChild.nextSibling;\n\t\t\t\ttplp.nodeName.should.equal('SPAN');\n\t\t\t\ttplp.getAttribute('typeof').should.equal('mw:Transclusion');\n\t\t\t\t// And not have data-parsoid, so it's used as new content\n\t\t\t\ttplp.hasAttribute('data-parsoid').should.equal(false);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not allow subst with pagebundle', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
'/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({ wikitext: \"{{1x|foo}}\", subst: 'true' })\n\t\t\t.expect(501)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should return a request too large error (post wt)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Large_Page',\n\t\t\t\t},\n\t\t\t\twikitext: \"a\".repeat(parsoidOptions.limits.wt2html.maxWikitextSize + 1),\n\t\t\t})\n\t\t\t.expect(413)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should return a request too large error (get page)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Large_Page/3')\n\t\t\t.expect(413)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should add redlinks for get (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Redlinks_Page/103')\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tdoc.body.querySelectorAll('a').length.should.equal(3);\n\t\t\t\tvar redLinks = doc.body.querySelectorAll('.new');\n\t\t\t\tredLinks.length.should.equal(1);\n\t\t\t\tredLinks[0].getAttribute('title').should.equal('Doesnotexist');\n\t\t\t\tvar redirects = doc.body.querySelectorAll('.mw-redirect');\n\t\t\t\tredirects.length.should.equal(1);\n\t\t\t\tredirects[0].getAttribute('title').should.equal('Redirected');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should add redlinks for get (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Redlinks_Page/103')\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.querySelectorAll('a').length.should.equal(3);\n\t\t\t\tvar redLinks = doc.body.querySelectorAll('.new');\n\t\t\t\tredLinks.length.should.equal(1);\n\t\t\t\tredLinks[0].getAttribute('title').should.equal('Doesnotexist');\n\t\t\t\tvar redirects = 
doc.body.querySelectorAll('.mw-redirect');\n\t\t\t\tredirects.length.should.equal(1);\n\t\t\t\tredirects[0].getAttribute('title').should.equal('Redirected');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should add redlinks for transform (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.send({\n\t\t\t\twikitext: \"[[Special:Version]] [[Doesnotexist]] [[Redirected]]\",\n\t\t\t})\n\t\t\t.expect(validHtmlResponse(function(doc) {\n\t\t\t\tdoc.body.querySelectorAll('a').length.should.equal(3);\n\t\t\t\tvar redLinks = doc.body.querySelectorAll('.new');\n\t\t\t\tredLinks.length.should.equal(1);\n\t\t\t\tredLinks[0].getAttribute('title').should.equal('Doesnotexist');\n\t\t\t\tvar redirects = doc.body.querySelectorAll('.mw-redirect');\n\t\t\t\tredirects.length.should.equal(1);\n\t\t\t\tredirects[0].getAttribute('title').should.equal('Redirected');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should add redlinks for transform (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\twikitext: \"[[Special:Version]] [[Doesnotexist]] [[Redirected]]\",\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.querySelectorAll('a').length.should.equal(3);\n\t\t\t\tvar redLinks = doc.body.querySelectorAll('.new');\n\t\t\t\tredLinks.length.should.equal(1);\n\t\t\t\tredLinks[0].getAttribute('title').should.equal('Doesnotexist');\n\t\t\t\tvar redirects = doc.body.querySelectorAll('.mw-redirect');\n\t\t\t\tredirects.length.should.equal(1);\n\t\t\t\tredirects[0].getAttribute('title').should.equal('Redirected');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\t// Variant conversion\n\t\tit('should not perform unnecessary variant conversion for get of en page (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Main_Page/1')\n\t\t\t.set('Accept-Language', 
'sr-el')\n\t\t\t.expect(validHtmlResponse())\n\t\t\t.expect('Content-Language', 'en')\n\t\t\t.expect((res) => {\n\t\t\t\tconst vary = res.headers.vary || '';\n\t\t\t\tvary.should.not.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform unnecessary variant conversion for get of en page (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Main_Page/1')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.expect(validPageBundleResponse())\n\t\t\t.expect((res) => {\n\t\t\t\t// HTTP headers should not be set.\n\t\t\t\tconst vary1 = res.headers.vary || '';\n\t\t\t\tvary1.should.not.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang1 = res.headers['content-language'] || '';\n\t\t\t\tlang1.should.equal('');\n\t\t\t\t// But equivalent headers should be present in the JSON body.\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\tconst vary2 = headers.vary || '';\n\t\t\t\tvary2.should.not.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang2 = headers['content-language'];\n\t\t\t\tlang2.should.equal('en');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform unnecessary variant conversion for get on page w/ magic word (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/No_Variant_Page/105')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\t// No conversion done since __NOCONTENTCONVERT__ is set\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd\\n');\n\t\t\t}))\n\t\t\t// But the vary/language headers are still set.\n\t\t\t.expect('Content-Language', 'sr-el')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform unnecessary variant conversion for get on page w/ magic word (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/No_Variant_Page/105')\n\t\t\t.set('Accept-Language', 
'sr-el')\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\t// No conversion done since __NOCONTENTCONVERT__ is set\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd\\n');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\t// HTTP headers should not be set.\n\t\t\t\tconst vary1 = res.headers.vary || '';\n\t\t\t\tvary1.should.not.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang1 = res.headers['content-language'] || '';\n\t\t\t\tlang1.should.equal('');\n\t\t\t\t// But vary/language headers should be set in JSON body.\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\tconst vary2 = headers.vary || '';\n\t\t\t\tvary2.should.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang2 = headers['content-language'];\n\t\t\t\tlang2.should.equal('sr-el');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform unrequested variant conversion for get w/ no accept-language header (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Variant_Page/104')\n\t\t\t// no accept-language header sent\n\t\t\t.expect('Content-Language', 'sr')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform unrequested variant conversion for get w/ no accept-language header (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Variant_Page/104')\n\t\t\t// no accept-language header sent\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\t// HTTP headers should not be set.\n\t\t\t\tconst vary1 = res.headers.vary || '';\n\t\t\t\tvary1.should.not.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang1 = res.headers['content-language'] || '';\n\t\t\t\tlang1.should.equal('');\n\t\t\t\t// But vary/language headers should be set in JSON 
body.\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\tconst vary2 = headers.vary || '';\n\t\t\t\tvary2.should.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang2 = headers['content-language'];\n\t\t\t\tlang2.should.equal('sr');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform variant conversion for get w/ base variant specified (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Variant_Page/104')\n\t\t\t.set('Accept-Language', 'sr') // this is base variant\n\t\t\t.expect('Content-Language', 'sr')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform variant conversion for get w/ base variant specified (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Variant_Page/104')\n\t\t\t.set('Accept-Language', 'sr') // this is base variant\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\t// HTTP headers should not be set.\n\t\t\t\tconst vary1 = res.headers.vary || '';\n\t\t\t\tvary1.should.not.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang1 = res.headers['content-language'] || '';\n\t\t\t\tlang1.should.equal('');\n\t\t\t\t// But vary/language headers should be set in JSON body.\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\tconst vary2 = headers.vary || '';\n\t\t\t\tvary2.should.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang2 = headers['content-language'];\n\t\t\t\tlang2.should.equal('sr');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform variant conversion for get w/ invalid variant specified (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Variant_Page/104')\n\t\t\t.set('Accept-Language', 'sr-BOGUS') // this doesn't 
exist\n\t\t\t.expect('Content-Language', 'sr')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform variant conversion for get w/ invalid variant specified (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Variant_Page/104')\n\t\t\t.set('Accept-Language', 'sr-BOGUS') // this doesn't exist\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\t// HTTP headers should not be set.\n\t\t\t\tconst vary1 = res.headers.vary || '';\n\t\t\t\tvary1.should.not.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang1 = res.headers['content-language'] || '';\n\t\t\t\tlang1.should.equal('');\n\t\t\t\t// But vary/language headers should be set in JSON body.\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\tconst vary2 = headers.vary || '';\n\t\t\t\tvary2.should.match(/\\bAccept-Language\\b/i);\n\t\t\t\tconst lang2 = headers['content-language'];\n\t\t\t\tlang2.should.equal('sr');\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for get (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/html/Variant_Page/104')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.expect('Content-Language', 'sr-el')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for get (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.get(mockDomain + '/v3/page/pagebundle/Variant_Page/104')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg 
abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\theaders.should.have.property('content-language');\n\t\t\t\theaders.should.have.property('vary');\n\t\t\t\theaders['content-language'].should.equal('sr-el');\n\t\t\t\theaders.vary.should.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for transform given pagelanguage in HTTP header (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.set('Content-Language', 'sr')\n\t\t\t.send({\n\t\t\t\twikitext: \"абвг abcd x\",\n\t\t\t})\n\t\t\t.expect('Content-Language', 'sr-el')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd x');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for transform given pagelanguage in HTTP header (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.set('Content-Language', 'sr')\n\t\t\t.send({\n\t\t\t\twikitext: \"абвг abcd x\",\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd x');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\theaders.should.have.property('content-language');\n\t\t\t\theaders.should.have.property('vary');\n\t\t\t\theaders['content-language'].should.equal('sr-el');\n\t\t\t\theaders.vary.should.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for transform given pagelanguage in JSON header (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept-Language', 
'sr-el')\n\t\t\t.send({\n\t\t\t\twikitext: {\n\t\t\t\t\theaders: {\n\t\t\t\t\t\t'content-language': 'sr',\n\t\t\t\t\t},\n\t\t\t\t\tbody: \"абвг abcd x\",\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect('Content-Language', 'sr-el')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd x');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for transform given pagelanguage in JSON header (pagebundle)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.send({\n\t\t\t\twikitext: {\n\t\t\t\t\theaders: {\n\t\t\t\t\t\t'content-language': 'sr',\n\t\t\t\t\t},\n\t\t\t\t\tbody: \"абвг abcd\",\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\theaders.should.have.property('content-language');\n\t\t\t\theaders.should.have.property('vary');\n\t\t\t\theaders['content-language'].should.equal('sr-el');\n\t\t\t\theaders.vary.should.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for transform given pagelanguage from oldid (html)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/html/')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.send({\n\t\t\t\toriginal: { revid: 104 },\n\t\t\t\twikitext: {\n\t\t\t\t\tbody: \"абвг abcd x\",\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect('Content-Language', 'sr-el')\n\t\t\t.expect('Vary', /\\bAccept-Language\\b/i)\n\t\t\t.expect(validHtmlResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd x');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should perform variant conversion for transform given pagelanguage from oldid (pagebundle)', function(done) 
{\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/wikitext/to/pagebundle/')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.send({\n\t\t\t\toriginal: { revid: 104 },\n\t\t\t\twikitext: \"абвг abcd\",\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse((doc) => {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\theaders.should.have.property('content-language');\n\t\t\t\theaders.should.have.property('vary');\n\t\t\t\theaders['content-language'].should.equal('sr-el');\n\t\t\t\theaders.vary.should.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t}); // end wt2html\n\n\tdescribe(\"html2wt\", function() {\n\n\t\tit('should require html when serializing', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should error when revision not found (transform, html2wt)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/Doesnotexist/2020')\n\t\t\t.send({\n\t\t\t\thtml: '<pre>hi ho</pre>'\n\t\t\t})\n\t\t\t.expect(404)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not error when oldid not supplied (transform, html2wt)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/Doesnotexist')\n\t\t\t.send({\n\t\t\t\thtml: '<pre>hi ho</pre>'\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(' hi ho\\n'))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept html as a string', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta 
property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" 
href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept html for json contentmodel as a string', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\"><head prefix=\"mwr: http://en.wikipedia.org/wiki/Special:Redirect/\"><meta charset=\"utf-8\"/><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:isVersionOf\" href=\"//en.wikipedia.org/wiki/Main_Page\"/><title></title><base href=\"//en.wikipedia.org/wiki/\"/><link rel=\"stylesheet\" href=\"//en.wikipedia.org/w/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid|ext.cite.style&amp;only=styles&amp;skin=vector\"/></head><body lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><table class=\"mw-json mw-json-object\"><tbody><tr><th>a</th><td class=\"value mw-json-number\">4</td></tr><tr><th>b</th><td class=\"value 
mw-json-number\">3</td></tr></tbody></table></body></html>',\n\t\t\t\tcontentmodel: 'json',\n\t\t\t})\n\t\t\t.expect(validWikitextResponse('{\"a\":4,\"b\":3}'))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept html with headers', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: {\n\t\t\t\t\theaders: {\n\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t},\n\t\t\t\t\tbody: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p 
data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should allow a title in the url', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/Main_Page')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: 
http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a 
rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should allow a title in the original data', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" 
href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" 
data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: \"Main_Page\",\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should allow a revision id in the url', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/Main_Page/1')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a 
rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should allow a revision id in the original data', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta 
property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" 
href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\trevid: 1,\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept original wikitext as src', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" 
href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" 
data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\twikitext: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/plain;profile=\"https://www.mediawiki.org/wiki/Specs/wikitext/1.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<strong>MediaWiki has been successfully installed.</strong>\\n\\nConsult the [//meta.wikimedia.org/wiki/Help:Contents User\\'s Guide] for information on using the wiki software.\\n\\n== Getting started ==\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings Configuration settings list]\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ MediaWiki FAQ]\\n* [https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce MediaWiki release mailing list]\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources Localise MediaWiki for your language]\\n',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept original html for selser (default)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" 
href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a 
rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: \"<!DOCTYPE html>\\n<html prefix=\\\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\\\" about=\\\"http://localhost/index.php/Special:Redirect/revision/1\\\"><head prefix=\\\"mwr: http://localhost/index.php/Special:Redirect/\\\"><meta property=\\\"mw:articleNamespace\\\" content=\\\"0\\\"/><link rel=\\\"dc:replaces\\\" resource=\\\"mwr:revision/0\\\"/><meta property=\\\"dc:modified\\\" content=\\\"2014-09-12T22:46:59.000Z\\\"/><meta about=\\\"mwr:user/0\\\" property=\\\"dc:title\\\" content=\\\"MediaWiki default\\\"/><link rel=\\\"dc:contributor\\\" resource=\\\"mwr:user/0\\\"/><meta property=\\\"mw:revisionSHA1\\\" content=\\\"8e0aa2f2a7829587801db67d0424d9b447e09867\\\"/><meta property=\\\"dc:description\\\" content=\\\"\\\"/><link rel=\\\"dc:isVersionOf\\\" href=\\\"http://localhost/index.php/Main_Page\\\"/><title>Main_Page</title><base href=\\\"http://localhost/index.php/\\\"/><link rel=\\\"stylesheet\\\" href=\\\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\\\"/></head><body id=\\\"mwAA\\\" lang=\\\"en\\\" class=\\\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\\\" dir=\\\"ltr\\\"><p id=\\\"mwAQ\\\"><strong id=\\\"mwAg\\\">MediaWiki has been successfully installed.</strong></p>\\n\\n<p id=\\\"mwAw\\\">Consult the <a rel=\\\"mw:ExtLink\\\" 
href=\\\"//meta.wikimedia.org/wiki/Help:Contents\\\" id=\\\"mwBA\\\">User's Guide</a> for information on using the wiki software.</p>\\n\\n<h2 id=\\\"mwBQ\\\"> Getting started </h2>\\n<ul id=\\\"mwBg\\\"><li id=\\\"mwBw\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\\\" id=\\\"mwCA\\\">Configuration settings list</a></li>\\n<li id=\\\"mwCQ\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\\\" id=\\\"mwCg\\\">MediaWiki FAQ</a></li>\\n<li id=\\\"mwCw\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\\\" id=\\\"mwDA\\\">MediaWiki release mailing list</a></li>\\n<li id=\\\"mwDQ\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\\\" id=\\\"mwDg\\\">Localise MediaWiki for your language</a></li></ul></body></html>\",\n\t\t\t\t\t},\n\t\t\t\t\t\"data-parsoid\": {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'application/json;profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\t\"counter\": 14,\n\t\t\t\t\t\t\t\"ids\": {\n\t\t\t\t\t\t\t\t\"mwAA\": { \"dsr\": [0, 592, 0, 0] }, \"mwAQ\": { \"dsr\": [0, 59, 0, 0] }, \"mwAg\": { \"stx\": \"html\", \"dsr\": [0, 59, 8, 9] }, \"mwAw\": { \"dsr\": [61, 171, 0, 0] }, \"mwBA\": { \"targetOff\": 114, \"contentOffsets\": [114, 126], \"dsr\": [73, 127, 41, 1] }, \"mwBQ\": { \"dsr\": [173, 194, 2, 2] }, \"mwBg\": { \"dsr\": [195, 592, 0, 0] }, \"mwBw\": { \"dsr\": [195, 300, 1, 0] }, \"mwCA\": { \"targetOff\": 272, \"contentOffsets\": [272, 299], \"dsr\": [197, 300, 75, 1] }, \"mwCQ\": { \"dsr\": [301, 373, 1, 0] }, \"mwCg\": { \"targetOff\": 359, \"contentOffsets\": [359, 372], \"dsr\": [303, 373, 56, 1] }, \"mwCw\": { \"dsr\": [374, 472, 1, 0] }, \"mwDA\": { \"targetOff\": 441, \"contentOffsets\": [441, 
471], \"dsr\": [376, 472, 65, 1] }, \"mwDQ\": { \"dsr\": [473, 592, 1, 0] }, \"mwDg\": { \"targetOff\": 555, \"contentOffsets\": [555, 591], \"dsr\": [475, 592, 80, 1] },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept original html for selser (1.1.1, meta)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><meta property=\"mw:html:version\" content=\"1.1.1\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p 
data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html; profile=\"mediawiki.org/specs/html/1.1.1\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: \"<!DOCTYPE html>\\n<html prefix=\\\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\\\" about=\\\"http://localhost/index.php/Special:Redirect/revision/1\\\"><head prefix=\\\"mwr: http://localhost/index.php/Special:Redirect/\\\"><meta 
property=\\\"mw:articleNamespace\\\" content=\\\"0\\\"/><link rel=\\\"dc:replaces\\\" resource=\\\"mwr:revision/0\\\"/><meta property=\\\"dc:modified\\\" content=\\\"2014-09-12T22:46:59.000Z\\\"/><meta about=\\\"mwr:user/0\\\" property=\\\"dc:title\\\" content=\\\"MediaWiki default\\\"/><link rel=\\\"dc:contributor\\\" resource=\\\"mwr:user/0\\\"/><meta property=\\\"mw:revisionSHA1\\\" content=\\\"8e0aa2f2a7829587801db67d0424d9b447e09867\\\"/><meta property=\\\"dc:description\\\" content=\\\"\\\"/><link rel=\\\"dc:isVersionOf\\\" href=\\\"http://localhost/index.php/Main_Page\\\"/><title>Main_Page</title><base href=\\\"http://localhost/index.php/\\\"/><link rel=\\\"stylesheet\\\" href=\\\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\\\"/></head><body id=\\\"mwAA\\\" lang=\\\"en\\\" class=\\\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\\\" dir=\\\"ltr\\\"><p id=\\\"mwAQ\\\"><strong id=\\\"mwAg\\\">MediaWiki has been successfully installed.</strong></p>\\n\\n<p id=\\\"mwAw\\\">Consult the <a rel=\\\"mw:ExtLink\\\" href=\\\"//meta.wikimedia.org/wiki/Help:Contents\\\" id=\\\"mwBA\\\">User's Guide</a> for information on using the wiki software.</p>\\n\\n<h2 id=\\\"mwBQ\\\"> Getting started </h2>\\n<ul id=\\\"mwBg\\\"><li id=\\\"mwBw\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\\\" id=\\\"mwCA\\\">Configuration settings list</a></li>\\n<li id=\\\"mwCQ\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\\\" id=\\\"mwCg\\\">MediaWiki FAQ</a></li>\\n<li id=\\\"mwCw\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\\\" id=\\\"mwDA\\\">MediaWiki release mailing list</a></li>\\n<li 
id=\\\"mwDQ\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\\\" id=\\\"mwDg\\\">Localise MediaWiki for your language</a></li></ul></body></html>\",\n\t\t\t\t\t},\n\t\t\t\t\t\"data-parsoid\": {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'application/json;profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/0.0.1\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\t\"counter\": 14,\n\t\t\t\t\t\t\t\"ids\": {\n\t\t\t\t\t\t\t\t\"mwAA\": { \"dsr\": [0, 592, 0, 0] }, \"mwAQ\": { \"dsr\": [0, 59, 0, 0] }, \"mwAg\": { \"stx\": \"html\", \"dsr\": [0, 59, 8, 9] }, \"mwAw\": { \"dsr\": [61, 171, 0, 0] }, \"mwBA\": { \"targetOff\": 114, \"contentOffsets\": [114, 126], \"dsr\": [73, 127, 41, 1] }, \"mwBQ\": { \"dsr\": [173, 194, 2, 2] }, \"mwBg\": { \"dsr\": [195, 592, 0, 0] }, \"mwBw\": { \"dsr\": [195, 300, 1, 0] }, \"mwCA\": { \"targetOff\": 272, \"contentOffsets\": [272, 299], \"dsr\": [197, 300, 75, 1] }, \"mwCQ\": { \"dsr\": [301, 373, 1, 0] }, \"mwCg\": { \"targetOff\": 359, \"contentOffsets\": [359, 372], \"dsr\": [303, 373, 56, 1] }, \"mwCw\": { \"dsr\": [374, 472, 1, 0] }, \"mwDA\": { \"targetOff\": 441, \"contentOffsets\": [441, 471], \"dsr\": [376, 472, 65, 1] }, \"mwDQ\": { \"dsr\": [473, 592, 1, 0] }, \"mwDg\": { \"targetOff\": 555, \"contentOffsets\": [555, 591], \"dsr\": [475, 592, 80, 1] },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept original html for selser (1.1.1, headers)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\t// Don't set the mw:html:version so that we get it from the original/headers\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html prefix=\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\" about=\"http://localhost/index.php/Special:Redirect/revision/1\"><head 
prefix=\"mwr: http://localhost/index.php/Special:Redirect/\"><meta property=\"mw:articleNamespace\" content=\"0\"/><link rel=\"dc:replaces\" resource=\"mwr:revision/0\"/><meta property=\"dc:modified\" content=\"2014-09-12T22:46:59.000Z\"/><meta about=\"mwr:user/0\" property=\"dc:title\" content=\"MediaWiki default\"/><link rel=\"dc:contributor\" resource=\"mwr:user/0\"/><meta property=\"mw:revisionSHA1\" content=\"8e0aa2f2a7829587801db67d0424d9b447e09867\"/><meta property=\"dc:description\" content=\"\"/><link rel=\"dc:isVersionOf\" href=\"http://localhost/index.php/Main_Page\"/><title>Main_Page</title><base href=\"http://localhost/index.php/\"/><link rel=\"stylesheet\" href=\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\"/></head><body data-parsoid=\\'{\"dsr\":[0,592,0,0]}\\' lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\" dir=\"ltr\"><p data-parsoid=\\'{\"dsr\":[0,59,0,0]}\\'><strong data-parsoid=\\'{\"stx\":\"html\",\"dsr\":[0,59,8,9]}\\'>MediaWiki has been successfully installed.</strong></p>\\n\\n<p data-parsoid=\\'{\"dsr\":[61,171,0,0]}\\'>Consult the <a rel=\"mw:ExtLink\" href=\"//meta.wikimedia.org/wiki/Help:Contents\" data-parsoid=\\'{\"targetOff\":114,\"contentOffsets\":[114,126],\"dsr\":[73,127,41,1]}\\'>User\\'s Guide</a> for information on using the wiki software.</p>\\n\\n<h2 data-parsoid=\\'{\"dsr\":[173,194,2,2]}\\'> Getting started </h2>\\n<ul data-parsoid=\\'{\"dsr\":[195,592,0,0]}\\'><li data-parsoid=\\'{\"dsr\":[195,300,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\" data-parsoid=\\'{\"targetOff\":272,\"contentOffsets\":[272,299],\"dsr\":[197,300,75,1]}\\'>Configuration settings list</a></li>\\n<li 
data-parsoid=\\'{\"dsr\":[301,373,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\" data-parsoid=\\'{\"targetOff\":359,\"contentOffsets\":[359,372],\"dsr\":[303,373,56,1]}\\'>MediaWiki FAQ</a></li>\\n<li data-parsoid=\\'{\"dsr\":[374,472,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\" data-parsoid=\\'{\"targetOff\":441,\"contentOffsets\":[441,471],\"dsr\":[376,472,65,1]}\\'>MediaWiki release mailing list</a></li>\\n<li data-parsoid=\\'{\"dsr\":[473,592,1,0]}\\'> <a rel=\"mw:ExtLink\" href=\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\" data-parsoid=\\'{\"targetOff\":555,\"contentOffsets\":[555,591],\"dsr\":[475,592,80,1]}\\'>Localise MediaWiki for your language</a></li></ul></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html; profile=\"mediawiki.org/specs/html/1.1.1\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: \"<!DOCTYPE html>\\n<html prefix=\\\"dc: http://purl.org/dc/terms/ mw: http://mediawiki.org/rdf/\\\" about=\\\"http://localhost/index.php/Special:Redirect/revision/1\\\"><head prefix=\\\"mwr: http://localhost/index.php/Special:Redirect/\\\"><meta property=\\\"mw:articleNamespace\\\" content=\\\"0\\\"/><link rel=\\\"dc:replaces\\\" resource=\\\"mwr:revision/0\\\"/><meta property=\\\"dc:modified\\\" content=\\\"2014-09-12T22:46:59.000Z\\\"/><meta about=\\\"mwr:user/0\\\" property=\\\"dc:title\\\" content=\\\"MediaWiki default\\\"/><link rel=\\\"dc:contributor\\\" resource=\\\"mwr:user/0\\\"/><meta property=\\\"mw:revisionSHA1\\\" content=\\\"8e0aa2f2a7829587801db67d0424d9b447e09867\\\"/><meta property=\\\"dc:description\\\" content=\\\"\\\"/><link rel=\\\"dc:isVersionOf\\\" href=\\\"http://localhost/index.php/Main_Page\\\"/><title>Main_Page</title><base href=\\\"http://localhost/index.php/\\\"/><link rel=\\\"stylesheet\\\" 
href=\\\"//localhost/load.php?modules=mediawiki.legacy.commonPrint,shared|mediawiki.skinning.elements|mediawiki.skinning.content|mediawiki.skinning.interface|skins.vector.styles|site|mediawiki.skinning.content.parsoid&amp;only=styles&amp;debug=true&amp;skin=vector\\\"/></head><body id=\\\"mwAA\\\" lang=\\\"en\\\" class=\\\"mw-content-ltr sitedir-ltr ltr mw-body mw-body-content mediawiki\\\" dir=\\\"ltr\\\"><p id=\\\"mwAQ\\\"><strong id=\\\"mwAg\\\">MediaWiki has been successfully installed.</strong></p>\\n\\n<p id=\\\"mwAw\\\">Consult the <a rel=\\\"mw:ExtLink\\\" href=\\\"//meta.wikimedia.org/wiki/Help:Contents\\\" id=\\\"mwBA\\\">User's Guide</a> for information on using the wiki software.</p>\\n\\n<h2 id=\\\"mwBQ\\\"> Getting started </h2>\\n<ul id=\\\"mwBg\\\"><li id=\\\"mwBw\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings\\\" id=\\\"mwCA\\\">Configuration settings list</a></li>\\n<li id=\\\"mwCQ\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ\\\" id=\\\"mwCg\\\">MediaWiki FAQ</a></li>\\n<li id=\\\"mwCw\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce\\\" id=\\\"mwDA\\\">MediaWiki release mailing list</a></li>\\n<li id=\\\"mwDQ\\\"> <a rel=\\\"mw:ExtLink\\\" href=\\\"//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources\\\" id=\\\"mwDg\\\">Localise MediaWiki for your language</a></li></ul></body></html>\",\n\t\t\t\t\t},\n\t\t\t\t\t\"data-parsoid\": {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'application/json;profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/0.0.1\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\t\"counter\": 14,\n\t\t\t\t\t\t\t\"ids\": {\n\t\t\t\t\t\t\t\t\"mwAA\": { \"dsr\": [0, 592, 0, 0] }, \"mwAQ\": { \"dsr\": [0, 59, 0, 0] }, \"mwAg\": { \"stx\": \"html\", \"dsr\": [0, 59, 8, 9] }, \"mwAw\": { \"dsr\": [61, 171, 0, 0] }, 
\"mwBA\": { \"targetOff\": 114, \"contentOffsets\": [114, 126], \"dsr\": [73, 127, 41, 1] }, \"mwBQ\": { \"dsr\": [173, 194, 2, 2] }, \"mwBg\": { \"dsr\": [195, 592, 0, 0] }, \"mwBw\": { \"dsr\": [195, 300, 1, 0] }, \"mwCA\": { \"targetOff\": 272, \"contentOffsets\": [272, 299], \"dsr\": [197, 300, 75, 1] }, \"mwCQ\": { \"dsr\": [301, 373, 1, 0] }, \"mwCg\": { \"targetOff\": 359, \"contentOffsets\": [359, 372], \"dsr\": [303, 373, 56, 1] }, \"mwCw\": { \"dsr\": [374, 472, 1, 0] }, \"mwDA\": { \"targetOff\": 441, \"contentOffsets\": [441, 471], \"dsr\": [376, 472, 65, 1] }, \"mwDQ\": { \"dsr\": [473, 592, 1, 0] }, \"mwDg\": { \"targetOff\": 555, \"contentOffsets\": [555, 591], \"dsr\": [475, 592, 80, 1] },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse())\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should return http 400 if supplied data-parsoid is empty', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<html><head></head><body><p>hi</p></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<html><head></head><body><p>ho</p></body></html>',\n\t\t\t\t\t},\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'application/json;profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: {},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\t// FIXME: Pagebundle validation in general is needed\n\t\tit.skip('should return http 400 if supplied data-parsoid is a string', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: 
'<html><head></head><body><p>hi</p></body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<html><head></head><body><p>ho</p></body></html>',\n\t\t\t\t\t},\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'application/json;profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: 'Garbled text from RESTBase.',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\t// The following three tests should all serialize as:\n\t\t//   \"<div>Selser test\"\n\t\t// However, we're deliberately setting the original wikitext in\n\t\t// the first two to garbage so that when selser doesn't detect any\n\t\t// difference between the new and old html, it'll just reuse that\n\t\t// string and we have a reliable way of determining that selser\n\t\t// was used.\n\n\t\tit('should use selser with supplied wikitext', function(done) {\n\t\t\t// New and old html are identical, which should produce no diffs\n\t\t\t// and reuse the original wikitext.\n\t\t\tclient.req\n\t\t\t// Need to provide an oldid so that selser mode is enabled\n\t\t\t// Without an oldid, serialization falls back to non-selser wts.\n\t\t\t// The oldid is used to fetch wikitext, but if wikitext is provided\n\t\t\t// (as in this test), it is not used. So, for testing purposes,\n\t\t\t// we can use any old random id, as long as something is present.\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">Selser test</div></body></html>\",\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: \"Junk Page\",\n\t\t\t\t\trevid: 1234,\n\t\t\t\t\twikitext: {\n\t\t\t\t\t\tbody: \"1. This is just some junk. 
See the comment above.\",\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\tbody: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">Selser test</div></body></html>\",\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t\"data-parsoid\": {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\t\"ids\": {\n\t\t\t\t\t\t\t\tmwAA: {},\n\t\t\t\t\t\t\t\tmwBB: { \"autoInsertedEnd\": true, \"stx\": \"html\" },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(\n\t\t\t\t\"1. This is just some junk. See the comment above.\"\n\t\t\t))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should use selser with wikitext fetched from the mw api', function(done) {\n\t\t\t// New and old html are identical, which should produce no diffs\n\t\t\t// and reuse the original wikitext.\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">Selser test</div></body></html>\",\n\t\t\t\toriginal: {\n\t\t\t\t\trevid: 2,\n\t\t\t\t\ttitle: \"Junk Page\",\n\t\t\t\t\thtml: {\n\t\t\t\t\t\tbody: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">Selser test</div></body></html>\",\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t\"data-parsoid\": {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\t\"ids\": {\n\t\t\t\t\t\t\t\tmwAA: {},\n\t\t\t\t\t\t\t\tmwBB: { \"autoInsertedEnd\": true, \"stx\": \"html\" },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(\n\t\t\t\t\"2. This is just some junk. 
See the comment above.\"\n\t\t\t))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should fallback to non-selective serialization', function(done) {\n\t\t\t// Without the original wikitext and an unavailable\n\t\t\t// TemplateFetch for the source (no revision id provided),\n\t\t\t// it should fallback to non-selective serialization.\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">Selser test</div></body></html>\",\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: \"Junk Page\",\n\t\t\t\t\thtml: {\n\t\t\t\t\t\tbody: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">Selser test</div></body></html>\",\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t\"data-parsoid\": {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\t\"ids\": {\n\t\t\t\t\t\t\t\tmwAA: {},\n\t\t\t\t\t\t\t\tmwBB: { \"autoInsertedEnd\": true, \"stx\": \"html\" },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(\n\t\t\t\t\"<div>Selser test\"\n\t\t\t))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should apply data-parsoid to duplicated ids', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">data-parsoid test</div><div id=\\\"mwBB\\\">data-parsoid test</div></body></html>\",\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: \"Doesnotexist\",\n\t\t\t\t\thtml: {\n\t\t\t\t\t\tbody: \"<html><body id=\\\"mwAA\\\"><div id=\\\"mwBB\\\">data-parsoid test</div></body></html>\",\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t\"data-parsoid\": {\n\t\t\t\t\t\tbody: 
{\n\t\t\t\t\t\t\t\"ids\": {\n\t\t\t\t\t\t\t\tmwAA: {},\n\t\t\t\t\t\t\t\tmwBB: { \"autoInsertedEnd\": true, \"stx\": \"html\" },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(\n\t\t\t\t\"<div>data-parsoid test<div>data-parsoid test\"\n\t\t\t))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should return a 400 for missing inline data-mw (2.x)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">hi</p>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"pi\": [[{ \"k\": \"1\" }]] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/2.2.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">ho</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should return a 400 for not supplying data-mw', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">hi</p>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"pi\": [[{ \"k\": \"1\" }]] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">ho</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should apply original data-mw', function(done) 
{\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">hi</p>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"pi\": [[{ \"k\": \"1\" }]] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"parts\": [{ \"template\": { \"target\": { \"wt\": \"1x\", \"href\": \"./Template:1x\" }, \"params\": { \"1\": { \"wt\": \"hi\" } }, \"i\": 0 } }] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">ho</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse('{{1x|hi}}'))\n\t\t\t.end(done);\n\t\t});\n\n\t\t// Sanity check data-mw was applied in the previous test\n\t\tit('should return a 400 for missing modified data-mw', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">hi</p>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"pi\": [[{ \"k\": \"1\" }]] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { } },  // Missing data-mw.parts!\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" 
id=\"mwAQ\">ho</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should give precedence to inline data-mw over original', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" data-mw=\\'{\"parts\":[{\"template\":{\"target\":{\"wt\":\"1x\",\"href\":\"./Template:1x\"},\"params\":{\"1\":{\"wt\":\"hi\"}},\"i\":0}}]}\\' id=\"mwAQ\">hi</p>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"pi\": [[{ \"k\": \"1\" }]] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { } },  // Missing data-mw.parts!\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">ho</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse('{{1x|hi}}'))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not apply original data-mw if modified is supplied', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">hi</p>',\n\t\t\t\t'data-mw': {\n\t\t\t\t\tbody: {\n\t\t\t\t\t\tids: { \"mwAQ\": { \"parts\": [{ \"template\": { \"target\": { \"wt\": \"1x\", \"href\": \"./Template:1x\" }, \"params\": { \"1\": { \"wt\": \"hi\" } }, \"i\": 0 } }] } },\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"pi\": [[{ \"k\": \"1\" }]] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': {\n\t\t\t\t\t\tbody: 
{\n\t\t\t\t\t\t\tids: { \"mwAQ\": { } },  // Missing data-mw.parts!\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" id=\"mwAQ\">ho</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse('{{1x|hi}}'))\n\t\t\t.end(done);\n\t\t});\n\n\t\t// The next three tests, although redundant with the above precedence\n\t\t// tests, are an attempt to show clients the semantics of separate\n\t\t// data-mw in the API.  The main idea is,\n\t\t//\n\t\t//   non-inline-data-mw = modified || original\n\t\t//   inline-data-mw > non-inline-data-mw\n\n\t\tit('should apply original data-mw when modified is absent (captions 1)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p><span class=\"mw-default-size\" typeof=\"mw:Image\" id=\"mwAg\"><a href=\"./File:Foobar.jpg\" id=\"mwAw\"><img resource=\"./File:Foobar.jpg\" src=\"//upload.wikimedia.org/wikipedia/commons/3/3a/Foobar.jpg\" data-file-width=\"240\" data-file-height=\"28\" data-file-type=\"bitmap\" height=\"28\" width=\"240\" id=\"mwBA\"/></a></span></p>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\t\"mwAg\": { \"optList\": [{ \"ck\": \"caption\", \"ak\": \"Testing 123\" }] },\n\t\t\t\t\t\t\t\t\"mwAw\": { \"a\": { \"href\": \"./File:Foobar.jpg\" }, \"sa\": {} },\n\t\t\t\t\t\t\t\t\"mwBA\": { \"a\": { \"resource\": \"./File:Foobar.jpg\", \"height\": \"28\", \"width\": \"240\" },\"sa\": { \"resource\": \"File:Foobar.jpg\" } },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\t\"mwAg\": { \"caption\": \"Testing 123\" 
},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p><span class=\"mw-default-size\" typeof=\"mw:Image\" id=\"mwAg\"><a href=\"./File:Foobar.jpg\" id=\"mwAw\"><img resource=\"./File:Foobar.jpg\" src=\"//upload.wikimedia.org/wikipedia/commons/3/3a/Foobar.jpg\" data-file-width=\"240\" data-file-height=\"28\" data-file-type=\"bitmap\" height=\"28\" width=\"240\" id=\"mwBA\"/></a></span></p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse('[[File:Foobar.jpg|Testing 123]]'))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should give precedence to inline data-mw over modified (captions 2)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p><span class=\"mw-default-size\" typeof=\"mw:Image\" data-mw=\"{}\" id=\"mwAg\"><a href=\"./File:Foobar.jpg\" id=\"mwAw\"><img resource=\"./File:Foobar.jpg\" src=\"//upload.wikimedia.org/wikipedia/commons/3/3a/Foobar.jpg\" data-file-width=\"240\" data-file-height=\"28\" data-file-type=\"bitmap\" height=\"28\" width=\"240\" id=\"mwBA\"/></a></span></p>',\n\t\t\t\t'data-mw': {\n\t\t\t\t\tbody: {\n\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\"mwAg\": { \"caption\": \"Testing 123\" },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\t\"mwAg\": { \"optList\": [{ \"ck\": \"caption\", \"ak\": \"Testing 123\" }] },\n\t\t\t\t\t\t\t\t\"mwAw\": { \"a\": { \"href\": \"./File:Foobar.jpg\" }, \"sa\": {} },\n\t\t\t\t\t\t\t\t\"mwBA\": { \"a\": { \"resource\": \"./File:Foobar.jpg\", \"height\": \"28\", \"width\": \"240\" },\"sa\": { \"resource\": \"File:Foobar.jpg\" } },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': 
{\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\t\"mwAg\": { \"caption\": \"Testing 123\" },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p><span class=\"mw-default-size\" typeof=\"mw:Image\" id=\"mwAg\"><a href=\"./File:Foobar.jpg\" id=\"mwAw\"><img resource=\"./File:Foobar.jpg\" src=\"//upload.wikimedia.org/wikipedia/commons/3/3a/Foobar.jpg\" data-file-width=\"240\" data-file-height=\"28\" data-file-type=\"bitmap\" height=\"28\" width=\"240\" id=\"mwBA\"/></a></span></p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse('[[File:Foobar.jpg]]'))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should give precedence to modified data-mw over original (captions 3)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<p><span class=\"mw-default-size\" typeof=\"mw:Image\" id=\"mwAg\"><a href=\"./File:Foobar.jpg\" id=\"mwAw\"><img resource=\"./File:Foobar.jpg\" src=\"//upload.wikimedia.org/wikipedia/commons/3/3a/Foobar.jpg\" data-file-width=\"240\" data-file-height=\"28\" data-file-type=\"bitmap\" height=\"28\" width=\"240\" id=\"mwBA\"/></a></span></p>',\n\t\t\t\t'data-mw': {\n\t\t\t\t\tbody: {\n\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\"mwAg\": {},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\t\"mwAg\": { \"optList\": [{ \"ck\": \"caption\", \"ak\": \"Testing 123\" }] },\n\t\t\t\t\t\t\t\t\"mwAw\": { \"a\": { \"href\": \"./File:Foobar.jpg\" }, \"sa\": {} },\n\t\t\t\t\t\t\t\t\"mwBA\": { \"a\": { \"resource\": \"./File:Foobar.jpg\", \"height\": \"28\", \"width\": \"240\" },\"sa\": { \"resource\": \"File:Foobar.jpg\" } 
},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: {\n\t\t\t\t\t\t\t\t\"mwAg\": { \"caption\": \"Testing 123\" },\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p><span class=\"mw-default-size\" typeof=\"mw:Image\" id=\"mwAg\"><a href=\"./File:Foobar.jpg\" id=\"mwAw\"><img resource=\"./File:Foobar.jpg\" src=\"//upload.wikimedia.org/wikipedia/commons/3/3a/Foobar.jpg\" data-file-width=\"240\" data-file-height=\"28\" data-file-type=\"bitmap\" height=\"28\" width=\"240\" id=\"mwBA\"/></a></span></p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse('[[File:Foobar.jpg]]'))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should apply extra normalizations (scrub_wikitext)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<h2></h2>',\n\t\t\t\tscrub_wikitext: true,\n\t\t\t\toriginal: { title: 'Doesnotexist' },\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(\n\t\t\t\t''\n\t\t\t))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should apply extra normalizations (scrubWikitext)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<h2></h2>',\n\t\t\t\tscrubWikitext: true,\n\t\t\t\toriginal: { title: 'Doesnotexist' },\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(\n\t\t\t\t''\n\t\t\t))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should suppress extra normalizations', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<h2></h2>',\n\t\t\t\toriginal: { title: 'Doesnotexist' 
},\n\t\t\t})\n\t\t\t.expect(validWikitextResponse(\n\t\t\t\t'==<nowiki/>==\\n'\n\t\t\t))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should return a request too large error', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/html/to/wikitext/')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Large_Page',\n\t\t\t\t},\n\t\t\t\thtml: \"a\".repeat(parsoidOptions.limits.html2wt.maxHTMLSize + 1),\n\t\t\t})\n\t\t\t.expect(413)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should fail to downgrade the original version for an unknown transition', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/wikitext/')\n\t\t\t.send({\n\t\t\t\thtml: '<!DOCTYPE html>\\n<html><head><meta charset=\"utf-8\"/><meta property=\"mw:html:version\" content=\"2.2.0\"/></head><body id=\"mwAA\" lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body-content parsoid-body mediawiki mw-parser-output\" dir=\"ltr\">123</body></html>',\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': { body: { \"ids\": {} } },\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/2222.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<!DOCTYPE html>\\n<html><head><meta charset=\"utf-8\"/><meta property=\"mw:html:version\" content=\"2222.0.0\"/></head><body id=\"mwAA\" lang=\"en\" class=\"mw-content-ltr sitedir-ltr ltr mw-body-content parsoid-body mediawiki mw-parser-output\" dir=\"ltr\">123</body></html>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t}); // end html2wt\n\n\tdescribe('pb2pb', function() {\n\n\t\tit('should require an original or previous version', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/Reuse_Page/100')\n\t\t\t.send({})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tvar previousRevHTML = {\n\t\t\trevid: 99,\n\t\t\thtml: 
{\n\t\t\t\theaders: {\n\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t},\n\t\t\t\tbody: '<p about=\"#mwt1\" typeof=\"mw:Transclusion\" data-mw=\\'{\"parts\":[{\"template\":{\"target\":{\"wt\":\"colours of the rainbow\",\"href\":\"./Template:Colours_of_the_rainbow\"},\"params\":{},\"i\":0}}]}\\' id=\"mwAg\">pink</p>',\n\t\t\t},\n\t\t\t\"data-parsoid\": {\n\t\t\t\theaders: {\n\t\t\t\t\t'content-type': 'application/json;profile=\"https://www.mediawiki.org/wiki/Specs/data-parsoid/' + defaultContentVersion + '\"',\n\t\t\t\t},\n\t\t\t\tbody: {\n\t\t\t\t\t'counter': 2,\n\t\t\t\t\t'ids': {\n\t\t\t\t\t\t'mwAg': { 'pi': [[]], 'src': '{{colours of the rainbow}}' },  // artificially added src\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t};\n\n\t\tit('should error when revision not found (transform, pb2pb)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/Doesnotexist')\n\t\t\t.send({\n\t\t\t\tprevious: previousRevHTML,\n\t\t\t})\n\t\t\t.expect(404)\n\t\t\t.end(done);\n\t\t});\n\n\t\t// FIXME: Expansion reuse wasn't ported, see T98995\n\t\tit.skip('should accept the previous revision to reuse expansions', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/Reuse_Page/100')\n\t\t\t.send({\n\t\t\t\tprevious: previousRevHTML,\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.firstChild.textContent.should.match(/pink/);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tvar origHTML = Util.clone(previousRevHTML);\n\t\torigHTML.revid = 100;\n\n\t\t// FIXME: Expansion reuse wasn't ported, see T98995\n\t\tit.skip('should accept the original and reuse certain expansions', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/Reuse_Page/100')\n\t\t\t.send({\n\t\t\t\tupdates: {\n\t\t\t\t\ttransclusions: 
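The `content-type` headers above carry the Parsoid spec version in a `profile` parameter, and the version assertions in these tests hinge on extracting it. A hedged sketch of that extraction; the regex is illustrative, not Parsoid's actual header parser:

```javascript
// Pull the spec version out of a header like
//   text/html;profile="https://www.mediawiki.org/wiki/Specs/HTML/2.2.0"
// Illustrative regex; Parsoid's real header parsing may differ.
function profileVersion(contentType) {
	var m = /profile="https:\/\/www\.mediawiki\.org\/wiki\/Specs\/[^"\/]+\/(\d+\.\d+\.\d+)"/.exec(contentType);
	return m ? m[1] : null;
}
```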
true,\n\t\t\t\t},\n\t\t\t\toriginal: origHTML,\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.firstChild.textContent.should.match(/purple/);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should refuse an unknown conversion (2.x -> 999.x)', function(done) {\n\t\t\tpreviousRevHTML.html.headers['content-type'].should.equal('text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/2.2.0\"');\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/Reuse_Page/100')\n\t\t\t.set('Accept', 'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/999.0.0\"')\n\t\t\t.send({\n\t\t\t\tprevious: previousRevHTML,\n\t\t\t})\n\t\t\t.expect(415)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should downgrade 999.x content to 2.x', function(done) {\n\t\t\tvar contentVersion = '2.2.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/')\n\t\t\t.set('Accept', 'application/json; profile=\"https://www.mediawiki.org/wiki/Specs/pagebundle/' + contentVersion + '\"')\n\t\t\t.send({\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"pi\": [[{ \"k\": \"1\" }]] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t'data-mw': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: { \"mwAQ\": { \"parts\": [{ \"template\": { \"target\": { \"wt\": \"1x\", \"href\": \"./Template:1x\" }, \"params\": { \"1\": { \"wt\": \"hi\" } }, \"i\": 0 } }] } },\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/999.0.0\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<!DOCTYPE html>\\n<html><head><meta charset=\"utf-8\"/><meta property=\"mw:html:version\" content=\"999.0.0\"/></head><body><p about=\"#mwt1\" typeof=\"mw:Transclusion\" 
id=\"mwAQ\">ho</p></body></html>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(200)\n\t\t\t.expect(acceptablePageBundleResponse(contentVersion, function(html) {\n\t\t\t\t// In < 999.x, data-mw is still inline.\n\t\t\t\thtml.should.match(/\\s+data-mw\\s*=\\s*['\"]/);\n\t\t\t\thtml.should.not.match(/\\s+data-parsoid\\s*=\\s*['\"]/);\n\t\t\t\tvar doc = domino.createDocument(html);\n\t\t\t\tvar meta = doc.querySelector('meta[property=\"mw:html:version\"]');\n\t\t\t\tmeta.getAttribute('content').should.equal(contentVersion);\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept the original and update the redlinks', function(done) {\n\t\t\t// NOTE: Keep this on an older version to show that it's preserved\n\t\t\t// through the transformation.\n\t\t\tvar contentVersion = '2.0.0';\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\tupdates: {\n\t\t\t\t\tredlinks: true,\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\ttitle: 'Doesnotexist',\n\t\t\t\t\t'data-parsoid': {\n\t\t\t\t\t\tbody: {\n\t\t\t\t\t\t\tids: {},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + contentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p><a rel=\"mw:WikiLink\" href=\"./Special:Version\" title=\"Special:Version\">Special:Version</a> <a rel=\"mw:WikiLink\" href=\"./Doesnotexist\" title=\"Doesnotexist\">Doesnotexist</a> <a rel=\"mw:WikiLink\" href=\"./Redirected\" title=\"Redirected\">Redirected</a></p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(acceptablePageBundleResponse(contentVersion, function(html) {\n\t\t\t\tvar doc = domino.createDocument(html);\n\t\t\t\tdoc.body.querySelectorAll('a').length.should.equal(3);\n\t\t\t\tvar redLinks = 
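The two version tests above encode the supported matrix: asking to upgrade 2.x content to a 999.x pagebundle is refused with 415, while 999.x content downgrades to 2.x (with `data-mw` moved back inline). A table-driven sketch of that check, listing only the transition these tests demonstrate:

```javascript
// Known downgrade transitions, keyed by major version. Only the
// 999.x -> 2.x path shown in the tests above is listed; the table
// structure itself is illustrative.
var DOWNGRADES = [{ from: 999, to: 2 }];
function canDowngrade(fromVersion, toVersion) {
	var from = parseInt(fromVersion, 10);
	var to = parseInt(toVersion, 10);
	return DOWNGRADES.some(function(d) {
		return d.from === from && d.to === to;
	});
}
```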
doc.body.querySelectorAll('.new');\n\t\t\t\tredLinks.length.should.equal(1);\n\t\t\t\tredLinks[0].getAttribute('title').should.equal('Doesnotexist');\n\t\t\t\tvar redirects = doc.body.querySelectorAll('.mw-redirect');\n\t\t\t\tredirects.length.should.equal(1);\n\t\t\t\tredirects[0].getAttribute('title').should.equal('Redirected');\n\t\t\t}))\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should refuse variant conversion on en page', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\tupdates: {\n\t\t\t\t\tvariant: { target: 'sr-el' },\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\trevid: 1,\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p>абвг abcd</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(400)\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept the original and do variant conversion (given oldid)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/')\n\t\t\t.send({\n\t\t\t\tupdates: {\n\t\t\t\t\tvariant: { target: 'sr-el' },\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\trevid: 104, /* sets the pagelanguage */\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p>абвг abcd x</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(200)\n\t\t\t.expect((res) => {\n\t\t\t\t// We don't actually require the result to have data-parsoid\n\t\t\t\t// if the input didn't have data-parsoid; hack the result\n\t\t\t\t// in order to make validPageBundleResponse() pass.\n\t\t\t\tres.body['data-parsoid'].body = {};\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd 
x');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\theaders.should.have.property('content-language');\n\t\t\t\theaders['content-language'].should.equal('sr-el');\n\t\t\t\theaders.should.have.property('vary');\n\t\t\t\theaders.vary.should.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should accept the original and do variant conversion (given pagelanguage)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + '/v3/transform/pagebundle/to/pagebundle/')\n\t\t\t.set('Content-Language', 'sr')\n\t\t\t.set('Accept-Language', 'sr-el')\n\t\t\t.send({\n\t\t\t\tupdates: {\n\t\t\t\t\tvariant: { /* target implicit from accept-language */ },\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p>абвг abcd</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(200)\n\t\t\t.expect((res) => {\n\t\t\t\t// We don't actually require the result to have data-parsoid\n\t\t\t\t// if the input didn't have data-parsoid; hack the result\n\t\t\t\t// in order to make validPageBundleResponse() pass.\n\t\t\t\tres.body['data-parsoid'].body = {};\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.textContent.should.equal('abvg abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\theaders.should.have.property('content-language');\n\t\t\t\theaders['content-language'].should.equal('sr-el');\n\t\t\t\theaders.should.have.property('vary');\n\t\t\t\theaders.vary.should.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t\tit('should not perform variant conversion w/ invalid variant (given pagelanguage)', function(done) {\n\t\t\tclient.req\n\t\t\t.post(mockDomain + 
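The variant tests above expect `абвг` to transliterate to `abvg` under `sr-el` (Serbian in Latin script). A minimal sketch covering just the four letters used in the fixtures; the real converter handles the full Serbian Cyrillic alphabet, including digraphs such as љ → lj:

```javascript
// Cyrillic -> Latin map restricted to the letters in the test fixture;
// a real sr-el converter covers the whole alphabet.
var SR_EL = { 'а': 'a', 'б': 'b', 'в': 'v', 'г': 'g' };
function toSrEl(text) {
	return text.replace(/[абвг]/g, function(c) { return SR_EL[c]; });
}
```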
'/v3/transform/pagebundle/to/pagebundle/')\n\t\t\t.set('Content-Language', 'sr')\n\t\t\t.set('Accept-Language', 'sr-BOGUS')\n\t\t\t.send({\n\t\t\t\tupdates: {\n\t\t\t\t\tvariant: { /* target implicit from accept-language */ },\n\t\t\t\t},\n\t\t\t\toriginal: {\n\t\t\t\t\thtml: {\n\t\t\t\t\t\theaders: {\n\t\t\t\t\t\t\t'content-type': 'text/html;profile=\"https://www.mediawiki.org/wiki/Specs/HTML/' + defaultContentVersion + '\"',\n\t\t\t\t\t\t},\n\t\t\t\t\t\tbody: '<p>абвг abcd</p>',\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\t.expect(200)\n\t\t\t.expect((res) => {\n\t\t\t\t// We don't actually require the result to have data-parsoid\n\t\t\t\t// if the input didn't have data-parsoid; hack the result\n\t\t\t\t// in order to make validPageBundleResponse() pass.\n\t\t\t\tres.body['data-parsoid'].body = {};\n\t\t\t})\n\t\t\t.expect(validPageBundleResponse(function(doc) {\n\t\t\t\tdoc.body.textContent.should.equal('абвг abcd');\n\t\t\t}))\n\t\t\t.expect((res) => {\n\t\t\t\tconst headers = res.body.html.headers;\n\t\t\t\theaders.should.have.property('content-language');\n\t\t\t\theaders['content-language'].should.equal('sr');\n\t\t\t\theaders.should.have.property('vary');\n\t\t\t\theaders.vary.should.match(/\\bAccept-Language\\b/i);\n\t\t\t})\n\t\t\t.end(done);\n\t\t});\n\n\t});  // end pb2pb\n\n});\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/mockAPI.js","messages":[{"ruleId":"no-shadow","severity":2,"message":"'text' is already declared in the upper scope.","line":566,"column":13,"nodeType":"Identifier","messageId":"noShadow","endLine":566,"endColumn":17}],"errorCount":1,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"// This file is used to run a stub API that mimics the MediaWiki interface\n// for the purposes of testing extension expansion.\n\n'use 
strict';\n\nrequire('../core-upgrade.js');\n\nvar fs = require('fs');\nvar yaml = require('js-yaml');\nvar path = require('path');\nvar express = require('express');\nvar crypto = require('crypto');\nvar busboy = require('connect-busboy');\n\nvar Promise = require('../lib/utils/promise.js');\n\n// Get Parsoid limits.\nvar optionsPath = path.resolve(__dirname, './test.config.yaml');\nvar optionsYaml = fs.readFileSync(optionsPath, 'utf8');\nvar parsoidOptions = yaml.load(optionsYaml).services[0].conf;\n\n// configuration to match PHP parserTests\nvar IMAGE_BASE_URL = 'http://example.com/images';\nvar IMAGE_DESC_URL = IMAGE_BASE_URL;\n// IMAGE_BASE_URL='http://upload.wikimedia.org/wikipedia/commons';\n// IMAGE_DESC_URL='http://commons.wikimedia.org/wiki';\nvar FILE_PROPS = {\n\t'Foobar.jpg': {\n\t\tsize: 7881,\n\t\twidth: 1941,\n\t\theight: 220,\n\t\tbits: 8,\n\t\tmime: 'image/jpeg',\n\t},\n\t'Thumb.png': {\n\t\tsize: 22589,\n\t\twidth: 135,\n\t\theight: 135,\n\t\tbits: 8,\n\t\tmime: 'image/png',\n\t},\n\t'Foobar.svg': {\n\t\tsize: 12345,\n\t\twidth: 240,\n\t\theight: 180,\n\t\tbits: 24,\n\t\tmime: 'image/svg+xml',\n\t},\n\t'LoremIpsum.djvu': {\n\t\tsize: 3249,\n\t\twidth: 2480,\n\t\theight: 3508,\n\t\tbits: 8,\n\t\tmime: 'image/vnd.djvu',\n\t},\n\t'Video.ogv': {\n\t\tsize: 12345,\n\t\twidth: 320,\n\t\theight: 240,\n\t\tbits: 0,\n\t\tduration: 160.733333333333,\n\t\tmime: 'application/ogg',\n\t\tmediatype: 'VIDEO',\n\t},\n\t'Audio.oga': {\n\t\tsize: 12345,\n\t\twidth: 0,\n\t\theight: 0,\n\t\tbits: 0,\n\t\tduration: 160.733333333333,\n\t\tmime: 'application/ogg',\n\t\tmediatype: 'AUDIO',\n\t},\n};\n\n/* -------------------- web app access points below --------------------- */\n\nvar app = express();\n\n// application/x-www-form-urlencoded\n// multipart/form-data\napp.use(busboy({\n\tlimits: {\n\t\tfields: 10,\n\t\tfieldSize: 15 * 1024 * 1024,\n\t},\n}));\napp.use(function(req, res, next) {\n\treq.body = req.body || {};\n\tif (!req.busboy) {\n\t\treturn 
next();\n\t}\n\treq.busboy.on('field', function(field, val) {\n\t\treq.body[field] = val;\n\t});\n\treq.busboy.on('finish', function() {\n\t\tnext();\n\t});\n\treq.pipe(req.busboy);\n});\n\nvar mainPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'1': {\n\t\t\t\tpageid: 1,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: 'Main Page',\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 1,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': '<strong>MediaWiki has been successfully installed.</strong>\\n\\nConsult the [//meta.wikimedia.org/wiki/Help:Contents User\\'s Guide] for information on using the wiki software.\\n\\n== Getting started ==\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings Configuration settings list]\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ MediaWiki FAQ]\\n* [https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce MediaWiki release mailing list]\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources Localise MediaWiki for your language]',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\n// Old response structure, pre-mcr\nvar oldResponse = {\n\tquery: {\n\t\tpages: {\n\t\t\t'999': {\n\t\t\t\tpageid: 999,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: 'Old Response',\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 999,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t'*': '<strong>MediaWiki was successfully installed.</strong>\\n\\nConsult the [//meta.wikimedia.org/wiki/Help:Contents User\\'s Guide] for information on using the wiki software.\\n\\n== Getting started ==\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Manual:Configuration_settings Configuration settings list]\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Manual:FAQ MediaWiki FAQ]\\n* 
[https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce MediaWiki release mailing list]\\n* [//www.mediawiki.org/wiki/Special:MyLanguage/Localisation#Translation_resources Localise MediaWiki for your language]',\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar junkPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'2': {\n\t\t\t\tpageid: 2,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: \"Junk Page\",\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 2,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': '2. This is just some junk. See the comment above.',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar largePage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'3': {\n\t\t\t\tpageid: 3,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: 'Large_Page',\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 3,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': 'a'.repeat(parsoidOptions.limits.wt2html.maxWikitextSize + 1),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar reusePage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'100': {\n\t\t\t\tpageid: 100,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: 'Reuse_Page',\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 100,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': '{{colours of the rainbow}}',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar jsonPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'101': {\n\t\t\t\tpageid: 101,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: 'JSON_Page',\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 
101,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'json',\n\t\t\t\t\t\t\t\tcontentformat: 'text/json',\n\t\t\t\t\t\t\t\t'*': '[1]',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar lintPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'102': {\n\t\t\t\tpageid: 102,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: \"Lint Page\",\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 102,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': '{|\\nhi\\n|ho\\n|}',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar redlinksPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'103': {\n\t\t\t\tpageid: 103,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: \"Redlinks Page\",\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 103,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': '[[Special:Version]] [[Doesnotexist]] [[Redirected]]',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar variantPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'104': {\n\t\t\t\tpageid: 104,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: \"Variant Page\",\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 104,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': 'абвг abcd',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t\tpagelanguage: 'sr',\n\t\t\t\tpagelanguagedir: 'ltr',\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar noVariantPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'105': {\n\t\t\t\tpageid: 105,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: \"No Variant Page\",\n\t\t\t\trevisions: 
[\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 105,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': 'абвг abcd\\n__NOCONTENTCONVERT__',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t\tpagelanguage: 'sr',\n\t\t\t\tpagelanguagedir: 'ltr',\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar revisionPage = {\n\tquery: {\n\t\tpages: {\n\t\t\t'63': {\n\t\t\t\tpageid: 63,\n\t\t\t\tns: 0,\n\t\t\t\ttitle: 'Revision ID',\n\t\t\t\trevisions: [\n\t\t\t\t\t{\n\t\t\t\t\t\trevid: 63,\n\t\t\t\t\t\tparentid: 0,\n\t\t\t\t\t\tslots: {\n\t\t\t\t\t\t\tmain: {\n\t\t\t\t\t\t\t\tcontentmodel: 'wikitext',\n\t\t\t\t\t\t\t\tcontentformat: 'text/x-wiki',\n\t\t\t\t\t\t\t\t'*': '{{REVISIONID}}',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t],\n\t\t\t},\n\t\t},\n\t},\n};\n\nvar fnames = {\n\t'Image:Foobar.jpg': 'Foobar.jpg',\n\t'File:Foobar.jpg': 'Foobar.jpg',\n\t'Archivo:Foobar.jpg': 'Foobar.jpg',\n\t'Mynd:Foobar.jpg': 'Foobar.jpg',\n\t'Датотека:Foobar.jpg': 'Foobar.jpg',\n\t'Image:Foobar.svg': 'Foobar.svg',\n\t'File:Foobar.svg': 'Foobar.svg',\n\t'Image:Thumb.png': 'Thumb.png',\n\t'File:Thumb.png': 'Thumb.png',\n\t'File:LoremIpsum.djvu': 'LoremIpsum.djvu',\n\t'File:Video.ogv': 'Video.ogv',\n\t'File:Audio.oga': 'Audio.oga',\n};\n\nvar pnames = {\n\t'Image:Foobar.jpg': 'File:Foobar.jpg',\n\t'Image:Foobar.svg': 'File:Foobar.svg',\n\t'Image:Thumb.png': 'File:Thumb.png',\n};\n\n// This templatedata description only provides a subset of fields\n// that mediawiki API returns. 
Parsoid only uses the format and\n// paramOrder fields at this point, so keeping these lean.\nvar templateData = {\n\t'Template:NoFormatWithParamOrder': {\n\t\t'paramOrder': ['f0', 'f1', 'unused2', 'f2', 'unused3'],\n\t},\n\t'Template:InlineTplNoParamOrder': {\n\t\t'format': 'inline',\n\t},\n\t'Template:BlockTplNoParamOrder': {\n\t\t'format': 'block',\n\t},\n\t'Template:InlineTplWithParamOrder': {\n\t\t'format': 'inline',\n\t\t'paramOrder': ['f1','f2'],\n\t},\n\t'Template:BlockTplWithParamOrder': {\n\t\t'format': 'block',\n\t\t'paramOrder': ['f1','f2'],\n\t},\n\t'Template:WithParamOrderAndAliases': {\n\t\t'params': {\n\t\t\t'f1': { 'aliases': ['f4','f3'] }\n\t\t},\n\t\t'paramOrder': ['f1','f2'],\n\t},\n\t'Template:InlineFormattedTpl_1': {\n\t\t'format': '{{_|_=_}}',\n\t},\n\t'Template:InlineFormattedTpl_2': {\n\t\t'format': '\\n{{_ | _ = _}}',\n\t},\n\t'Template:InlineFormattedTpl_3': {\n\t\t'format': '{{_| _____ = _}}',\n\t},\n\t'Template:BlockFormattedTpl_1': {\n\t\t'format': '{{_\\n| _ = _\\n}}',\n\t},\n\t'Template:BlockFormattedTpl_2': {\n\t\t'format': '\\n{{_\\n| _ = _\\n}}\\n',\n\t},\n\t'Template:BlockFormattedTpl_3': {\n\t\t'format': '{{_|\\n _____ = _}}',\n\t},\n};\n\nvar formatters = {\n\tjson: function(data) {\n\t\treturn JSON.stringify(data);\n\t},\n\tjsonfm: function(data) {\n\t\treturn JSON.stringify(data, null, 2);\n\t},\n};\n\nvar preProcess = function(text, revid, formatversion) {\n\tvar match = text.match(/{{1x\\|(.*?)}}/);\n\tif (match) {\n\t\treturn { wikitext: match[1] };\n\t} else if (text === '{{colours of the rainbow}}') {\n\t\treturn { wikitext: 'purple' };\n\t} else if (text === '{{REVISIONID}}') {\n\t\treturn { wikitext: String(revid) };\n\t} else {\n\t\treturn null;\n\t}\n};\n\nvar imageInfo = function(filename, twidth, theight, useBatchAPI) {\n\tvar normPagename = pnames[filename] || filename;\n\tvar normFilename = fnames[filename] || filename;\n\tif (!(normFilename in FILE_PROPS)) {\n\t\treturn null;\n\t}\n\tvar props = 
FILE_PROPS[normFilename] || Object.create(null);\n\tvar md5 = crypto.createHash('md5').update(normFilename).digest('hex');\n\tvar md5prefix = md5[0] + '/' + md5[0] + md5[1] + '/';\n\tvar baseurl = IMAGE_BASE_URL + '/' + md5prefix + normFilename;\n\tvar height = props.hasOwnProperty('height') ? props.height : 220;\n\tvar width = props.hasOwnProperty('width') ? props.width : 1941;\n\tvar turl = IMAGE_BASE_URL + '/thumb/' + md5prefix + normFilename;\n\tvar durl = IMAGE_DESC_URL + '/' + normFilename;\n\tvar mediatype = props.mediatype ||\n\t\t\t(props.mime === 'image/svg+xml' ? 'DRAWING' : 'BITMAP');\n\tvar result = {\n\t\tsize: props.size || 12345,\n\t\theight: height,\n\t\twidth: width,\n\t\turl: baseurl,\n\t\tdescriptionurl: durl,\n\t\tmediatype: mediatype,\n\t\tmime: props.mime,\n\t};\n\tif (props.hasOwnProperty('duration')) {\n\t\tresult.duration = props.duration;\n\t}\n\t// The batch api always generates thumbs, as does the videoinfo handler\n\tif ((useBatchAPI || result.mediatype === 'VIDEO') &&\n\t\t\t(theight === undefined || theight === null) &&\n\t\t\t(twidth === undefined || twidth === null)) {\n\t\ttwidth = width;\n\t\ttheight = height;\n\t}\n\tif ((theight !== undefined && theight !== null) ||\n\t\t\t(twidth !== undefined && twidth !== null)) {\n\t\tif (twidth && (theight === undefined || theight === null)) {\n\t\t\t// File::scaleHeight in PHP\n\t\t\ttheight = Math.round(height * twidth / width);\n\t\t} else if (theight && (twidth === undefined || twidth === null)) {\n\t\t\t// MediaHandler::fitBoxWidth in PHP\n\t\t\t// This is crazy!\n\t\t\tvar idealWidth = width * theight / height;\n\t\t\tvar roundedUp = Math.ceil(idealWidth);\n\t\t\tif (Math.round(roundedUp * height / width) > theight) {\n\t\t\t\ttwidth = Math.floor(idealWidth);\n\t\t\t} else {\n\t\t\t\ttwidth = roundedUp;\n\t\t\t}\n\t\t} else {\n\t\t\tif (Math.round(height * twidth / width) > theight) {\n\t\t\t\ttwidth = Math.ceil(width * theight / height);\n\t\t\t} else {\n\t\t\t\ttheight = 
Math.round(height * twidth / width);\n\t\t\t}\n\t\t}\n\t\tconsole.assert(typeof (twidth) === 'number');\n\t\tvar urlWidth = twidth;\n\t\tif (twidth > width) {\n\t\t\t// The PHP api won't enlarge a bitmap ... but the batch api will.\n\t\t\t// But, to match the PHP sections, don't scale.\n\t\t\tif (mediatype !== 'DRAWING') {\n\t\t\t\turlWidth = width;\n\t\t\t}\n\t\t}\n\t\tif (urlWidth !== width || ['AUDIO', 'VIDEO'].includes(mediatype)) {\n\t\t\tturl += '/' + urlWidth + 'px-' + normFilename;\n\t\t\tswitch (mediatype) {\n\t\t\t\tcase 'AUDIO':\n\t\t\t\t\t// No thumbs are generated for audio\n\t\t\t\t\tturl = IMAGE_BASE_URL + '/w/resources/assets/file-type-icons/fileicon-ogg.png';\n\t\t\t\t\tbreak;\n\t\t\t\tcase 'VIDEO':\n\t\t\t\t\tturl += '.jpg';\n\t\t\t\t\tbreak;\n\t\t\t\tcase 'DRAWING':\n\t\t\t\t\tturl += '.png';\n\t\t\t\t\tbreak;\n\t\t\t}\n\t\t} else {\n\t\t\tturl = baseurl;\n\t\t}\n\t\tresult.thumbwidth = twidth;\n\t\tresult.thumbheight = theight;\n\t\tresult.thumburl = turl;\n\t}\n\treturn {\n\t\tresult: result,\n\t\tnormPagename: normPagename,\n\t};\n};\n\nvar querySiteinfo = function(prefix, formatversion, cb) {\n\tcb(null, require(`../baseconfig/${formatversion === 2 ? '2/' : ''}${prefix}.json`));\n};\n\nvar parse = function(text, onlypst, formatversion) {\n\tvar fmt = (text) => {\n\t\treturn { text: (formatversion === 2) ? 
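The thumbnail arithmetic above mirrors two PHP routines by name: `File::scaleHeight` (width given, height derived) and `MediaHandler::fitBoxWidth` (height given, width derived with the round-trip check the comment calls "crazy"). Extracted as standalone functions for illustration:

```javascript
// File::scaleHeight in PHP: derive a thumb height from a requested width.
function scaleHeight(width, height, twidth) {
	return Math.round(height * twidth / width);
}

// MediaHandler::fitBoxWidth in PHP: derive a thumb width from a requested
// height, rounding up only if the resulting height stays inside the box.
function fitBoxWidth(width, height, theight) {
	var idealWidth = width * theight / height;
	var roundedUp = Math.ceil(idealWidth);
	return (Math.round(roundedUp * height / width) > theight)
		? Math.floor(idealWidth)
		: roundedUp;
}
```

For the mock's Foobar.jpg (1941×220), a 240px-wide thumb comes out 27px tall, and fitting a 28px-high box yields a 248px width.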
text : { \"*\": text } };\n\t};\n\t// We're performing a subst\n\tif (onlypst) {\n\t\treturn fmt(text.replace(/\\{\\{subst:1x\\|([^}]+)\\}\\}/, \"$1\"));\n\t}\n\t// Render to html the contents of known extension tags\n\tvar match = text.match(/<([A-Za-z][^\\t\\n\\v />\\0]*)/);\n\tswitch ((match && match[1]) || '') {\n\t\t// FIXME: this isn't really used by the mocha tests\n\t\t// since some mocha tests hit the production db, but\n\t\t// when we fix that, they should go through this.\n\t\tcase 'templatestyles':\n\t\t\treturn fmt(\"<style data-mw-deduplicate='TemplateStyles:r123456'>small { font-size: 120% } big { font-size: 80% }</style>\"); // Silliness\n\t\tcase 'translate':\n\t\t\treturn fmt(text);\n\t\tcase 'indicator':\n\t\tcase 'section':\n\t\t\treturn fmt('\\n');\n\t\tdefault:\n\t\t\tthrow new Error(\"Unhandled extension type encountered in: \" + text);\n\t}\n};\n\nvar missingTitles = new Set([\n\t'Doesnotexist',\n]);\n\nvar specialTitles = new Set([\n\t'Special:Version',\n]);\n\nvar redirectTitles = new Set([\n\t'Redirected',\n]);\n\nvar disambigTitles = new Set([\n\t'Disambiguation',\n]);\n\nvar pageProps = function(titles) {\n\tif (!Array.isArray(titles)) { return null; }\n\treturn titles.map(function(t) {\n\t\tvar props = { title: t };\n\t\tif (missingTitles.has(t)) { props.missing = true; }\n\t\tif (specialTitles.has(t)) { props.special = true; }\n\t\tif (redirectTitles.has(t)) { props.redirect = true; }\n\t\tif (disambigTitles.has(t)) {\n\t\t\tprops.linkclasses = [ 'mw-disambig' ];\n\t\t}\n\t\treturn props;\n\t});\n};\n\nconst fv2Queries = new Map();\n\nvar availableActions = {\n\tparse: function(prefix, body, cb) {\n\t\tvar formatversion = +(body.formatversion || 1);\n\t\tvar result = parse(body.text, body.onlypst, formatversion);\n\t\tcb(null, { parse: result });\n\t},\n\n\tquery: function(prefix, body, cb) {\n\t\tvar formatversion = +(body.formatversion || 1);\n\t\tif (body.meta === 'siteinfo') {\n\t\t\treturn querySiteinfo(prefix, formatversion, 
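The `parse` mock above routes on the first extension-tag name found in the text, using a single regex. A standalone copy of that extraction for illustration:

```javascript
// Extract the first tag name from wikitext-ish input, as the parse mock
// does before switching on known extension tags.
function firstTagName(text) {
	var match = text.match(/<([A-Za-z][^\t\n\v \/>\0]*)/);
	return (match && match[1]) || '';
}
```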
cb);\n\t\t} else if (body.prop === \"info|pageprops\") {\n\t\t\tconsole.assert(formatversion === 2);\n\t\t\treturn cb(null, {\n\t\t\t\tquery: {\n\t\t\t\t\tpages: pageProps(body.titles.split('|')),\n\t\t\t\t},\n\t\t\t});\n\t\t}\n\t\tconst title = (body.titles || '').replace(/_/g, ' ');\n\t\tconst revid = body.revids;\n\t\tif (body.prop === \"info|revisions\") {\n\t\t\tlet query = null;\n\t\t\tif (revid === \"1\" || title === \"Main Page\") {\n\t\t\t\tquery = mainPage;\n\t\t\t} else if (revid === \"2\" || title === \"Junk Page\") {\n\t\t\t\tquery = junkPage;\n\t\t\t} else if (revid === '3' || title === 'Large Page') {\n\t\t\t\tquery = largePage;\n\t\t\t} else if (revid === '63' || title === 'Revision ID') {\n\t\t\t\tquery = revisionPage;\n\t\t\t} else if (revid === '100' || title === 'Reuse Page') {\n\t\t\t\tquery = reusePage;\n\t\t\t} else if (revid === '101' || title === 'JSON Page') {\n\t\t\t\tquery = jsonPage;\n\t\t\t} else if (revid === '102' || title === 'Lint Page') {\n\t\t\t\tquery = lintPage;\n\t\t\t} else if (revid === '103' || title === 'Redlinks Page') {\n\t\t\t\tquery = redlinksPage;\n\t\t\t} else if (revid === '104' || title === 'Variant Page') {\n\t\t\t\tquery = variantPage;\n\t\t\t} else if (revid === '105' || title === 'No Variant Page') {\n\t\t\t\tquery = noVariantPage;\n\t\t\t} else if (revid === '999' || title === 'Old Response') {\n\t\t\t\tquery = oldResponse;\n\t\t\t} else {\n\t\t\t\tquery = {\n\t\t\t\t\tquery: {\n\t\t\t\t\t\tpages: {\n\t\t\t\t\t\t\t'-1': {\n\t\t\t\t\t\t\t\tns: 6,\n\t\t\t\t\t\t\t\ttitle: title,\n\t\t\t\t\t\t\t\tmissing: '',\n\t\t\t\t\t\t\t\timagerepository: '',\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t};\n\t\t\t}\n\t\t\tif (formatversion === 2) {\n\t\t\t\tif (!fv2Queries.has(query)) {\n\t\t\t\t\tconst clone = JSON.parse(JSON.stringify(query));\n\t\t\t\t\tclone.query.pages = Object.keys(clone.query.pages).reduce((ps, k) => {\n\t\t\t\t\t\tconst page = clone.query.pages[k];\n\t\t\t\t\t\tif 
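The `info|revisions` branch above dispatches each fixture by revision id or title through a long if/else chain. The same lookup could be table-driven; a sketch with a few of the fixtures, where the `name` values stand in for the mock's fixture objects:

```javascript
// Illustrative table-driven alternative to the if/else dispatch above;
// each fixture is reachable by revid or by title.
var FIXTURES = [
	{ revid: '1', title: 'Main Page', name: 'mainPage' },
	{ revid: '3', title: 'Large Page', name: 'largePage' },
	{ revid: '100', title: 'Reuse Page', name: 'reusePage' },
];
function findFixture(revid, title) {
	var hit = FIXTURES.find(function(f) {
		return f.revid === revid || f.title === title;
	});
	return hit ? hit.name : null;
}
```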
(Array.isArray(page.revisions)) {\n\t\t\t\t\t\t\tpage.revisions[0].slots.main = Object.assign(\n\t\t\t\t\t\t\t\t{},\n\t\t\t\t\t\t\t\tpage.revisions[0].slots.main,\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t'*': undefined,\n\t\t\t\t\t\t\t\t\t'content': page.revisions[0].slots.main['*'],\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t);\n\t\t\t\t\t\t\tpage.pagelanguage = page.pagelanguage || 'en';\n\t\t\t\t\t\t\tpage.pagelanguagedir = page.pagelanguagedir || 'ltr';\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tpage.missing = true;\n\t\t\t\t\t\t}\n\t\t\t\t\t\tps.push(page);\n\t\t\t\t\t\treturn ps;\n\t\t\t\t\t}, []);\n\t\t\t\t\tfv2Queries.set(query, clone);\n\t\t\t\t}\n\t\t\t\tquery = fv2Queries.get(query);\n\t\t\t}\n\t\t\treturn cb(null, query);\n\t\t}\n\t\tif (body.prop === 'imageinfo') {\n\t\t\tvar response = { query: { } };\n\t\t\tvar filename = body.titles;\n\t\t\tvar tonum = (x) => {\n\t\t\t\treturn (x === null || x === undefined) ? undefined : (+x);\n\t\t\t};\n\t\t\tvar ii = imageInfo(filename, tonum(body.iiurlwidth), tonum(body.iiurlheight), false);\n\t\t\tvar p;\n\t\t\tif (ii === null) {\n\t\t\t\tp = {\n\t\t\t\t\tns: 6,\n\t\t\t\t\ttitle: filename,\n\t\t\t\t\tmissing: '',\n\t\t\t\t\timagerepository: '',\n\t\t\t\t\timageinfo: [{\n\t\t\t\t\t\tsize: 0,\n\t\t\t\t\t\twidth: 0,\n\t\t\t\t\t\theight: 0,\n\t\t\t\t\t\tfilemissing: '',\n\t\t\t\t\t\tmime: null,\n\t\t\t\t\t\tmediatype: null\n\t\t\t\t\t}],\n\t\t\t\t};\n\t\t\t\tif (formatversion === 2) {\n\t\t\t\t\tp.missing = p.imageinfo.filemissing = true;\n\t\t\t\t\tp.badfile = false;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tif (filename !== ii.normPagename) {\n\t\t\t\t\tresponse.query.normalized = [{ from: filename, to: ii.normPagename }];\n\t\t\t\t}\n\t\t\t\tp = {\n\t\t\t\t\tpageid: 1,\n\t\t\t\t\tns: 6,\n\t\t\t\t\ttitle: ii.normPagename,\n\t\t\t\t\timageinfo: [ii.result],\n\t\t\t\t};\n\t\t\t\tif (formatversion === 2) {\n\t\t\t\t\tp.badfile = false;\n\t\t\t\t}\n\t\t\t}\n\t\t\tif (formatversion === 2) {\n\t\t\t\tresponse.query.pages = [ p ];\n\t\t\t} 
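The `formatversion=2` reshaping above turns the legacy `'*'` content key into `content`, defaults `pagelanguage`/`pagelanguagedir`, and flattens the keyed `pages` map into an array. A standalone sketch of that transform; it deletes the `'*'` key outright rather than shadowing it with `undefined` via `Object.assign` as the mock does, which is equivalent after JSON serialization:

```javascript
// Reshape a formatversion=1 query response into formatversion=2 form,
// mirroring the fv2Queries transformation above (deep clone included).
function toFormatVersion2(query) {
	var clone = JSON.parse(JSON.stringify(query));
	clone.query.pages = Object.keys(clone.query.pages).map(function(k) {
		var page = clone.query.pages[k];
		if (Array.isArray(page.revisions)) {
			var main = page.revisions[0].slots.main;
			main.content = main['*'];
			delete main['*'];
			page.pagelanguage = page.pagelanguage || 'en';
			page.pagelanguagedir = page.pagelanguagedir || 'ltr';
		} else {
			page.missing = true;
		}
		return page;
	});
	return clone;
}
```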
else {\n\t\t\t\tresponse.query.pages = { };\n\t\t\t\tresponse.query.pages[p.pageid || '-1'] = p;\n\t\t\t}\n\t\t\treturn cb(null, response);\n\t\t}\n\t\treturn cb(new Error('Uh oh!'));\n\t},\n\n\texpandtemplates: function(prefix, body, cb) {\n\t\tvar formatversion = +(body.formatversion || 1);\n\t\tvar res = preProcess(body.text, body.revid, formatversion);\n\t\tif (res === null) {\n\t\t\tcb(new Error('Sorry!'));\n\t\t} else {\n\t\t\tcb(null, { expandtemplates: res });\n\t\t}\n\t},\n\n\t'parsoid-batch': function(prefix, body, cb) {\n\t\tvar formatversion = +(body.formatversion || 1);\n\t\tvar batch;\n\t\ttry {\n\t\t\tbatch = JSON.parse(body.batch);\n\t\t\tconsole.assert(Array.isArray(batch));\n\t\t} catch (e) {\n\t\t\treturn cb(e);\n\t\t}\n\t\tvar errs = [];\n\t\tvar results = batch.map(function(b) {\n\t\t\tvar res = null;\n\t\t\tswitch (b.action) {\n\t\t\t\tcase 'preprocess':\n\t\t\t\t\tres = preProcess(b.text, b.revid, formatversion);\n\t\t\t\t\tbreak;\n\t\t\t\tcase 'imageinfo':\n\t\t\t\t\tvar txopts = b.txopts || {};\n\t\t\t\t\tvar ii = imageInfo('File:' + b.filename, txopts.width, txopts.height, true);\n\t\t\t\t\t// NOTE: Return early here since a null is acceptable.\n\t\t\t\t\treturn (ii !== null) ? ii.result : null;\n\t\t\t\tcase 'parse':\n\t\t\t\t\tres = parse(b.text, /* onlypst */false, formatversion);\n\t\t\t\t\tbreak;\n\t\t\t\tcase 'pageprops':\n\t\t\t\t\tres = pageProps(b.titles);\n\t\t\t\t\tbreak;\n\t\t\t}\n\t\t\tif (res === null) { errs.push(b); }\n\t\t\treturn res;\n\t\t});\n\t\tvar err = (errs.length > 0) ? 
new Error(JSON.stringify(errs)) : null;\n\t\tcb(err, { 'parsoid-batch': results });\n\t},\n\n\t// Return a dummy response\n\ttemplatedata: function(prefix, body, cb) {\n\t\tcb(null, {\n\t\t\t// FIXME: Assumes that body.titles is a single title\n\t\t\t// (which is how Parsoid uses this endpoint right now).\n\t\t\t'pages': {\n\t\t\t\t'1': templateData[body.titles] || {},\n\t\t\t},\n\t\t});\n\t},\n\n\tparaminfo: function(prefix, body, cb) {\n\t\tcb(null, { /* Just don't 400 for now. */ });\n\t},\n};\n\nvar actionDefinitions = {\n\tparse: {\n\t\tparameters: {\n\t\t\ttext: 'text',\n\t\t\ttitle: 'text',\n\t\t\tonlypst: 'boolean',\n\t\t},\n\t},\n\tquery: {\n\t\tparameters: {\n\t\t\ttitles: 'text',\n\t\t\tprop: 'text',\n\t\t\tiiprop: 'text',\n\t\t\tiiurlwidth: 'text',\n\t\t\tiiurlheight: 'text',\n\t\t},\n\t},\n};\n\nvar actionRegex = Object.keys(availableActions).join('|');\n\nfunction buildOptions(options) {\n\tvar optStr = '';\n\tfor (var i = 0; i < options.length; i++) {\n\t\toptStr += '<option value=\"' + options[i] + '\">' + options[i] + '</option>';\n\t}\n\treturn optStr;\n}\n\nfunction buildActionList() {\n\tvar actions = Object.keys(availableActions);\n\tvar setStr = '';\n\tfor (var i = 0; i < actions.length; i++) {\n\t\tvar action = actions[i];\n\t\tvar title = 'action=' + action;\n\t\tsetStr += '<li id=\"' + title + '\">';\n\t\tsetStr += '<a href=\"/' + action + '\">' + title + '</a></li>';\n\t}\n\treturn setStr;\n}\n\nfunction buildForm(action) {\n\tvar formStr = '';\n\tvar actionDef = actionDefinitions[action];\n\tvar params = actionDef.parameters;\n\tvar paramList = Object.keys(params);\n\n\tfor (var i = 0; i < paramList.length; i++) {\n\t\tvar param = paramList[i];\n\t\tif (typeof params[param] === 'string') {\n\t\t\tformStr += '<input type=\"' + params[param] + '\" name=\"' + param + '\" />';\n\t\t} else if (params[param].length) {\n\t\t\tformStr += '<select name=\"' + param + '\">';\n\t\t\tformStr += buildOptions(params[param]);\n\t\t\tformStr += 
'</select>';\n\t\t}\n\t}\n\treturn formStr;\n}\n\n// GET request to root....should probably just tell the client how to use the service\napp.get('/', function(req, res) {\n\tres.setHeader('Content-Type', 'text/html; charset=UTF-8');\n\tres.write(\n\t\t'<html><body>' +\n\t\t\t'<ul id=\"list-of-actions\">' +\n\t\t\t\tbuildActionList() +\n\t\t\t'</ul>' +\n\t\t'</body></html>');\n\tres.end();\n});\n\n// GET requests for any possible actions....tell the client how to use the action\napp.get(new RegExp('^/(' + actionRegex + ')'), function(req, res) {\n\tvar formats = buildOptions(Object.keys(formatters));\n\tvar action = req.params[0];\n\tvar returnHtml =\n\t\t\t'<form id=\"service-form\" method=\"GET\" action=\"api.php\">' +\n\t\t\t\t'<h2>GET form</h2>' +\n\t\t\t\t'<input name=\"action\" type=\"hidden\" value=\"' + action + '\" />' +\n\t\t\t\t'<select name=\"format\">' +\n\t\t\t\t\tformats +\n\t\t\t\t'</select>' +\n\t\t\t\tbuildForm(action) +\n\t\t\t\t'<input type=\"submit\" />' +\n\t\t\t'</form>' +\n\t\t\t'<form id=\"service-form\" method=\"POST\" action=\"api.php\">' +\n\t\t\t\t'<h2>POST form</h2>' +\n\t\t\t\t'<input name=\"action\" type=\"hidden\" value=\"' + action + '\" />' +\n\t\t\t\t'<select name=\"format\">' +\n\t\t\t\t\tformats +\n\t\t\t\t'</select>' +\n\t\t\t\tbuildForm(action) +\n\t\t\t\t'<input type=\"submit\" />' +\n\t\t\t'</form>';\n\n\tres.setHeader('Content-Type', 'text/html; charset=UTF-8');\n\tres.write(returnHtml);\n\tres.end();\n});\n\nfunction handleApiRequest(prefix, body, res) {\n\tvar format = body.format;\n\tvar action = body.action;\n\tvar formatter = formatters[format || \"json\"];\n\n\tif (!availableActions.hasOwnProperty(action)) {\n\t\treturn res.status(400).end(\"Unknown action.\");\n\t}\n\n\tavailableActions[action](prefix, body, function(err, data) {\n\t\tif (err === null) {\n\t\t\tres.setHeader('Content-Type', 'application/json');\n\t\t\tres.write(formatter(data));\n\t\t\tres.end();\n\t\t} else {\n\t\t\tres.setHeader('Content-Type', 
'text/plain');\n\t\t\tres.status(err.httpStatus || 500);\n\t\t\tres.write(err.stack || err.toString());\n\t\t\tres.end();\n\t\t}\n\t});\n}\n\n// GET request to api.php....actually perform an API request\napp.get('/:prefix/api.php', function(req, res) {\n\thandleApiRequest(req.params.prefix, req.query, res);\n});\napp.get('/api.php', function(req, res) {\n\thandleApiRequest('enwiki', req.query, res);\n});\n\n// POST request to api.php....actually perform an API request\napp.post('/:prefix/api.php', function(req, res) {\n\thandleApiRequest(req.params.prefix, req.body, res);\n});\napp.post('/api.php', function(req, res) {\n\thandleApiRequest('enwiki', req.body, res);\n});\n\nconst start = function(options) {\n\tvar logger = options.logger;\n\tvar server;\n\treturn new Promise(function(resolve, reject) {\n\t\tapp.on('error', function(err) {\n\t\t\tlogger.log('error', err);\n\t\t\treject(err);\n\t\t});\n\t\tserver = app.listen(options.config.port, options.config.iface, resolve);\n\t})\n\t.then(function() {\n\t\tvar port = server.address().port;\n\t\tlogger.log('info', 'Mock MediaWiki API: Started on ' + port);\n\t\treturn {\n\t\t\tclose: function() {\n\t\t\t\treturn Promise.promisify(server.close, false, server)();\n\t\t\t},\n\t\t\tport: port,\n\t\t};\n\t});\n};\n\nif (require.main === module) {\n\tstart({\n\t\tconfig: { port: process.env.MOCKPORT || 0 },\n\t\tlogger: { log: function(...args) { console.log(...args); } },\n\t})\n\t.catch((e) => { console.error(e); });\n} else {\n\tmodule.exports = 
start;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/citeParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/imageMapParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/langParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/legacyMediaParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/parserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/poemParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedD
eprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/sectionWrappingParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/selserWrappingParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/separatorTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/tableFixupsParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parser/timedMediaHandlerParserTests-knownFailures.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/parserTests.json","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRu
les":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/serviceWrapper.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/testreduce/config.example.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tests/testreduce/rtTestWrapper.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/ScriptUtils.js","messages":[{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":57,"column":4,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":57,"endColumn":19},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":94,"column":2,"nodeType":"Block","endLine":96,"endColumn":5},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":133,"column":2,"nodeType":"Block","endLine":135,"endColumn":5},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return declaration.","line":171,"column":2,"nodeType":"Block","endLine":173,"endColumn":5},{"ruleId":"jsdoc/require-returns","severity":1,"message":"Missing JSDoc @return 
declaration.","line":276,"column":2,"nodeType":"Block","endLine":287,"endColumn":5},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"opts\" type.","line":285,"column":null,"nodeType":"Block","endLine":285,"endColumn":null},{"ruleId":"jsdoc/require-param-type","severity":1,"message":"Missing JSDoc @param \"defaults\" type.","line":286,"column":null,"nodeType":"Block","endLine":286,"endColumn":null}],"errorCount":1,"warningCount":6,"fixableErrorCount":0,"fixableWarningCount":0,"source":"/**\n * This file contains general utilities for scripts in\n * the bin/, tools/, tests/ directories. This file should\n * not contain any helpers that are needed by code in the\n * lib/ directory.\n *\n * @module\n */\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\nvar Promise = require('../lib/utils/promise.js');\nvar request = Promise.promisify(require('request'), true);\nvar Util = require('../lib/utils/Util.js').Util;\n\nvar ScriptUtils = {\n\t/**\n\t * Split a tracing / debugging flag string into individual flags\n\t * and return them.\n\t *\n\t * @param {Object} origFlag The original flag string.\n\t * @return {Array}\n\t */\n\tsplitFlags: function(origFlag) {\n\t\tvar objFlags = origFlag.split(\",\");\n\t\tif (objFlags.indexOf(\"selser\") !== -1 && objFlags.indexOf(\"wts\") === -1) {\n\t\t\tobjFlags.push(\"wts\");\n\t\t}\n\t\treturn objFlags;\n\t},\n\n\t/**\n\t * Set debugging flags on an object, based on an options object.\n\t *\n\t * @param {Object} parsoidOptions Object to be assigned to the ParsoidConfig.\n\t * @param {Object} cliOpts The options object to use for setting the debug flags.\n\t * @return {Object} The modified object.\n\t */\n\tsetDebuggingFlags: function(parsoidOptions, cliOpts) {\n\t\t// Handle the --help options\n\t\tvar exit = false;\n\t\tif (cliOpts.trace === 'help') {\n\t\t\tconsole.error(ScriptUtils.traceUsageHelp());\n\t\t\texit = true;\n\t\t}\n\t\tif (cliOpts.dump === 'help') 
{\n\t\t\tconsole.error(ScriptUtils.dumpUsageHelp());\n\t\t\texit = true;\n\t\t}\n\t\tif (cliOpts.debug === 'help') {\n\t\t\tconsole.error(ScriptUtils.debugUsageHelp());\n\t\t\texit = true;\n\t\t}\n\t\tif (exit) {\n\t\t\tprocess.exit(1);\n\t\t}\n\n\t\t// Ok, no help requested: process the options.\n\t\tif (cliOpts.debug !== undefined) {\n\t\t\t// Continue to support generic debugging.\n\t\t\tif (cliOpts.debug === true) {\n\t\t\t\tconsole.warn(\"Warning: Generic debugging, not handler-specific.\");\n\t\t\t\tparsoidOptions.debug = ScriptUtils.booleanOption(cliOpts.debug);\n\t\t\t} else {\n\t\t\t\t// Setting --debug automatically enables --trace\n\t\t\t\tparsoidOptions.debugFlags = ScriptUtils.splitFlags(cliOpts.debug);\n\t\t\t\tparsoidOptions.traceFlags = parsoidOptions.debugFlags;\n\t\t\t}\n\t\t}\n\n\t\tif (cliOpts.trace !== undefined) {\n\t\t\tif (cliOpts.trace === true) {\n\t\t\t\tconsole.warn(\"Warning: Generic tracing is no longer supported. Ignoring --trace flag. Please provide handler-specific tracing flags, e.g. '--trace pre,html5', to turn it on.\");\n\t\t\t} else {\n\t\t\t\t// Add any new trace flags to the list of existing trace flags (if\n\t\t\t\t// any were inherited from debug); otherwise, create a new list.\n\t\t\t\tparsoidOptions.traceFlags = (parsoidOptions.traceFlags || []).concat(ScriptUtils.splitFlags(cliOpts.trace));\n\t\t\t}\n\t\t}\n\n\t\tif (cliOpts.dump !== undefined) {\n\t\t\tif (cliOpts.dump === true) {\n\t\t\t\tconsole.warn(\"Warning: Generic dumping not enabled. 
Please set a flag.\");\n\t\t\t} else {\n\t\t\t\tparsoidOptions.dumpFlags = ScriptUtils.splitFlags(cliOpts.dump);\n\t\t\t}\n\t\t}\n\n\t\treturn parsoidOptions;\n\t},\n\n\t/**\n\t * Returns a help message for the tracing flags.\n\t */\n\ttraceUsageHelp: function() {\n\t\treturn [\n\t\t\t\"Tracing\",\n\t\t\t\"-------\",\n\t\t\t\"- With one or more comma-separated flags, traces those specific phases\",\n\t\t\t\"- Supported flags:\",\n\t\t\t\"  * pre-peg   : shows input to tokenizer\",\n\t\t\t\"  * peg       : shows tokens emitted by tokenizer\",\n\t\t\t\"  * sync:1    : shows tokens flowing through the post-tokenizer Sync Token Transform Manager\",\n\t\t\t\"  * async:2   : shows tokens flowing through the Async Token Transform Manager\",\n\t\t\t\"  * sync:3    : shows tokens flowing through the post-expansion Sync Token Transform Manager\",\n\t\t\t\"  * tsp       : shows tokens flowing through the TokenStreamPatcher (useful to see in-order token stream)\",\n\t\t\t\"  * list      : shows actions of the list handler\",\n\t\t\t\"  * sanitizer : shows actions of the sanitizer\",\n\t\t\t\"  * pre       : shows actions of the pre handler\",\n\t\t\t\"  * p-wrap    : shows actions of the paragraph wrapper\",\n\t\t\t\"  * html      : shows tokens that are sent to the HTML tree builder\",\n\t\t\t\"  * dsr       : shows dsr computation on the DOM\",\n\t\t\t\"  * tplwrap   : traces template wrapping code (currently only range overlap/nest/merge code)\",\n\t\t\t\"  * wts       : trace actions of the regular wikitext serializer\",\n\t\t\t\"  * selser    : trace actions of the selective serializer\",\n\t\t\t\"  * domdiff   : trace actions of the DOM diffing code\",\n\t\t\t\"  * wt-escape : debug wikitext-escaping\",\n\t\t\t\"  * batcher   : trace API batch aggregation and dispatch\",\n\t\t\t\"  * apirequest: trace all API requests\",\n\t\t\t\"  * time      : trace times for various phases (right now, limited to DOMPP passes)\",\n\t\t\t\"  * time/dompp: trace times for DOM Post 
processing passes\",\n\t\t\t\"\",\n\t\t\t\"--debug enables tracing of all the above phases except Token Transform Managers\",\n\t\t\t\"\",\n\t\t\t\"Examples:\",\n\t\t\t\"$ node parse --trace pre,p-wrap,html < foo\",\n\t\t\t\"$ node parse --trace sync:3,dsr < foo\",\n\t\t].join('\\n');\n\t},\n\n\t/**\n\t * Returns a help message for the dump flags.\n\t */\n\tdumpUsageHelp: function() {\n\t\treturn [\n\t\t\t\"Dumping state\",\n\t\t\t\"-------------\",\n\t\t\t\"- Dumps state at different points of execution\",\n\t\t\t\"- DOM dumps are always doc.outerHTML\",\n\t\t\t\"- Supported flags:\",\n\t\t\t\"\",\n\t\t\t\"  * tplsrc            : dumps preprocessed template source that will be tokenized (via ?action=expandtemplates)\",\n\t\t\t\"  * extoutput         : dumps HTML output from extensions (via ?action=parse)\",\n\t\t\t\"\",\n\t\t\t\"  --- Dump flags for wt2html DOM passes ---\",\n\t\t\t\"  * dom:pre-XXX       : dumps DOM before pass XXX runs\",\n\t\t\t\"  * dom:post-XXX      : dumps DOM after pass XXX runs\",\n\t\t\t\"\",\n\t\t\t\"    Available passes (in the order they run):\",\n\t\t\t\"\",\n\t\t\t\"      dpload, fostered, tb-fixups, normalize, pwrap, \",\n\t\t\t\"      migrate-metas, pres, migrate-nls, dsr, tplwrap, \",\n\t\t\t\"      dom-unpack, tag:EXT (replace EXT with extension: cite, poem, etc)\",\n\t\t\t\"      sections, heading-ids, lang-converter, linter, \",\n\t\t\t\"      strip-metas, linkclasses, redlinks, downgrade\",\n\t\t\t\"\",\n\t\t\t\"  --- Dump flags for html2wt ---\",\n\t\t\t\"  * dom:post-dom-diff : in selective serialization, dumps DOM after running dom diff\",\n\t\t\t\"  * dom:post-normal   : in serialization, dumps DOM after normalization\",\n\t\t\t\"  * wt2html:limits    : dumps used resources (along with configured limits)\\n\",\n\t\t\t\"--debug dumps state at these different stages\\n\",\n\t\t\t\"Examples:\",\n\t\t\t\"$ node parse --dump dom:pre-dpload,dom:pre-dsr,dom:pre-tplwrap < foo\",\n\t\t\t\"$ node parse --trace html --dump 
dom:pre-tplwrap < foo\",\n\t\t\t\"\\n\",\n\t\t].join('\\n');\n\t},\n\n\t/**\n\t * Returns a help message for the debug flags.\n\t */\n\tdebugUsageHelp: function() {\n\t\treturn [\n\t\t\t\"Debugging\",\n\t\t\t\"---------\",\n\t\t\t\"- With one or more comma-separated flags, provides more verbose tracing than the equivalent trace flag\",\n\t\t\t\"- Supported flags:\",\n\t\t\t\"  * pre       : shows actions of the pre handler\",\n\t\t\t\"  * wts       : trace actions of the regular wikitext serializer\",\n\t\t\t\"  * selser    : trace actions of the selective serializer\",\n\t\t].join('\\n');\n\t},\n\n\t/**\n\t * Sets templating and processing flags on an object,\n\t * based on an options object.\n\t *\n\t * @param {Object} parsoidOptions Object to be assigned to the ParsoidConfig.\n\t * @param {Object} cliOpts The options object to use for setting the debug flags.\n\t * @return {Object} The modified object.\n\t */\n\tsetTemplatingAndProcessingFlags: function(parsoidOptions, cliOpts) {\n\t\t[\n\t\t\t'fetchConfig',\n\t\t\t'fetchTemplates',\n\t\t\t'fetchImageInfo',\n\t\t\t'expandExtensions',\n\t\t\t'addHTMLTemplateParameters',\n\t\t].forEach(function(c) {\n\t\t\tif (cliOpts[c] !== undefined) {\n\t\t\t\tparsoidOptions[c] = ScriptUtils.booleanOption(cliOpts[c]);\n\t\t\t}\n\t\t});\n\t\tif (cliOpts.usePHPPreProcessor !== undefined) {\n\t\t\tparsoidOptions.usePHPPreProcessor = parsoidOptions.fetchTemplates &&\n\t\t\t\tScriptUtils.booleanOption(cliOpts.usePHPPreProcessor);\n\t\t}\n\t\tif (cliOpts.maxDepth !== undefined) {\n\t\t\tparsoidOptions.maxDepth = typeof (cliOpts.maxdepth) === 'number' ?\n\t\t\t\tcliOpts.maxdepth : parsoidOptions.maxDepth;\n\t\t}\n\t\tif (cliOpts.apiURL) {\n\t\t\tif (!Array.isArray(parsoidOptions.mwApis)) {\n\t\t\t\tparsoidOptions.mwApis = [];\n\t\t\t}\n\t\t\tparsoidOptions.mwApis.push({ prefix: 'customwiki', uri: cliOpts.apiURL });\n\t\t}\n\t\tif (cliOpts.addHTMLTemplateParameters !== undefined) {\n\t\t\tparsoidOptions.addHTMLTemplateParameters 
=\n\t\t\t\tScriptUtils.booleanOption(cliOpts.addHTMLTemplateParameters);\n\t\t}\n\t\tif (cliOpts.lint) {\n\t\t\tparsoidOptions.linting = true;\n\t\t\tif (!parsoidOptions.linter) {\n\t\t\t\tparsoidOptions.linter = {};\n\t\t\t}\n\t\t\tparsoidOptions.linter.sendAPI = false;\n\t\t}\n\t\tif (cliOpts.useBatchAPI !== undefined) {\n\t\t\tparsoidOptions.useBatchAPI = cliOpts.useBatchAPI;\n\t\t}\n\n\t\treturn parsoidOptions;\n\t},\n\n\t/**\n\t * Parse a boolean option returned by the yargs package.\n\t * The strings 'false' and 'no' are also treated as false values.\n\t * This allows `--debug=no` and `--debug=false` to mean the same as\n\t * `--no-debug`.\n\t *\n\t * @param {boolean|string} val\n\t *   a boolean, or a string naming a boolean value.\n\t * @return {boolean}\n\t */\n\tbooleanOption: function(val) {\n\t\tif (!val) { return false; }\n\t\tif ((typeof val) === 'string' && /^(no|false)$/i.test(val)) {\n\t\t\treturn false;\n\t\t}\n\t\treturn true;\n\t},\n\n\t/**\n\t * Set the color flags, based on an options object.\n\t *\n\t * @param {Object} options\n\t *   The options object to use for setting the mode of the 'color' package.\n\t * @param {string|boolean} options.color\n\t *   Whether to use color.  
Passing 'auto' will enable color only if\n\t *   stdout is a TTY device.\n\t */\n\tsetColorFlags: function(options) {\n\t\tvar colors = require('colors');\n\t\tif (options.color === 'auto') {\n\t\t\tif (!process.stdout.isTTY) {\n\t\t\t\tcolors.mode = 'none';\n\t\t\t}\n\t\t} else if (!ScriptUtils.booleanOption(options.color)) {\n\t\t\tcolors.mode = 'none';\n\t\t}\n\t},\n\n\t/**\n\t * Add standard options to a yargs options hash.\n\t * This handles options parsed by `setDebuggingFlags`,\n\t * `setTemplatingAndProcessingFlags`, `setColorFlags`,\n\t * and standard --help options.\n\t *\n\t * The `defaults` option is optional, and lets you override\n\t * the defaults for the standard options.\n\t *\n\t * @param opts\n\t * @param defaults\n\t */\n\taddStandardOptions: function(opts, defaults) {\n\t\tvar standardOpts = {\n\t\t\t// standard CLI options\n\t\t\t'help': {\n\t\t\t\tdescription: 'Show this help message',\n\t\t\t\t'boolean': true,\n\t\t\t\t'default': false,\n\t\t\t\talias: 'h',\n\t\t\t},\n\t\t\t// handled by `setDebuggingFlags`\n\t\t\t'debug': {\n\t\t\t\tdescription: 'Provide optional flags. Use --debug=help for supported options',\n\t\t\t},\n\t\t\t'trace': {\n\t\t\t\tdescription: 'Use --trace=help for supported options',\n\t\t\t},\n\t\t\t'dump': {\n\t\t\t\tdescription: 'Dump state. 
Use --dump=help for supported options',\n\t\t\t},\n\t\t\t// handled by `setTemplatingAndProcessingFlags`\n\t\t\t'fetchConfig': {\n\t\t\t\tdescription: 'Whether to fetch the wiki config from the server or use our local copy',\n\t\t\t\t'boolean': true,\n\t\t\t\t'default': true,\n\t\t\t},\n\t\t\t'fetchTemplates': {\n\t\t\t\tdescription: 'Whether to fetch included templates recursively',\n\t\t\t\t'boolean': true,\n\t\t\t\t'default': true,\n\t\t\t},\n\t\t\t'fetchImageInfo': {\n\t\t\t\tdescription: 'Whether to fetch image info via the API',\n\t\t\t\t'boolean': true,\n\t\t\t\t'default': true,\n\t\t\t},\n\t\t\t'expandExtensions': {\n\t\t\t\tdescription: 'Whether we should request extension tag expansions from a wiki',\n\t\t\t\t'boolean': true,\n\t\t\t\t'default': true,\n\t\t\t},\n\t\t\t'usePHPPreProcessor': {\n\t\t\t\tdescription: 'Whether to use the PHP preprocessor to expand templates',\n\t\t\t\t'boolean': true,\n\t\t\t\t'default': true,\n\t\t\t},\n\t\t\t'addHTMLTemplateParameters': {\n\t\t\t\tdescription: 'Parse template parameters to HTML and add them to template data',\n\t\t\t\t'boolean': true,\n\t\t\t\t'default': false,\n\t\t\t},\n\t\t\t'maxdepth': {\n\t\t\t\tdescription: 'Maximum expansion depth',\n\t\t\t\t'default': 40,\n\t\t\t},\n\t\t\t'apiURL': {\n\t\t\t\tdescription: 'http path to remote API, e.g. 
http://en.wikipedia.org/w/api.php',\n\t\t\t\t'default': null,\n\t\t\t},\n\t\t\t'useBatchAPI': {\n\t\t\t\tdescription: 'Turn on/off the API batching system',\n\t\t\t\t'boolean': false,\n\t\t\t},\n\t\t\t// handled by `setColorFlags`\n\t\t\t'color': {\n\t\t\t\tdescription: 'Enable color output Ex: --no-color',\n\t\t\t\t'default': 'auto',\n\t\t\t},\n\t\t};\n\t\t// allow overriding defaults\n\t\tObject.keys(defaults || {}).forEach(function(name) {\n\t\t\tif (standardOpts[name]) {\n\t\t\t\tstandardOpts[name].default = defaults[name];\n\t\t\t}\n\t\t});\n\t\treturn Util.extendProps(opts, standardOpts);\n\t},\n};\n\n/**\n * Perform an HTTP request using the 'request' package, and retry on failures.\n * Only use on idempotent HTTP endpoints.\n *\n * @param {number} retries The number of retries to attempt.\n * @param {Object} requestOptions Request options.\n * @param {number} [delay] Exponential back-off.\n * @return {Promise}\n */\nScriptUtils.retryingHTTPRequest = function(retries, requestOptions, delay) {\n\tdelay = delay || 100;  // start with 100ms\n\treturn request(requestOptions)\n\t.catch(function(error) {\n\t\tif (retries--) {\n\t\t\tconsole.error('HTTP ' + requestOptions.method + ' to \\n' +\n\t\t\t\t\t(requestOptions.uri || requestOptions.url) + ' failed: ' + error +\n\t\t\t\t\t'\\nRetrying in ' + (delay / 1000) + ' seconds.');\n\t\t\treturn Promise.delay(delay).then(function() {\n\t\t\t\treturn ScriptUtils.retryingHTTPRequest(retries, requestOptions, delay * 2);\n\t\t\t});\n\t\t} else {\n\t\t\treturn Promise.reject(error);\n\t\t}\n\t})\n\t.spread(function(res, body) {\n\t\tif (res.statusCode !== 200) {\n\t\t\tthrow new Error('Got status code: ' + res.statusCode +\n\t\t\t\t'; body: ' + JSON.stringify(body || '').substr(0, 500));\n\t\t}\n\t\treturn Array.from(arguments);\n\t});\n};\n\nif (typeof module === \"object\") {\n\tmodule.exports.ScriptUtils = 
ScriptUtils;\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/build-langconv-fst.js","messages":[{"ruleId":"jsdoc/check-tag-names","severity":1,"message":"Invalid JSDoc tag name \"0@\".","line":14,"column":null,"nodeType":"Block","endLine":14,"endColumn":null},{"ruleId":"jsdoc/check-tag-names","severity":1,"message":"Invalid JSDoc tag name \"_IDENTITY_SYMBOL_@\".","line":18,"column":null,"nodeType":"Block","endLine":18,"endColumn":null},{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/language/FST.js\" is not found.","line":84,"column":21,"nodeType":"Literal","endLine":84,"endColumn":45},{"ruleId":"no-shadow","severity":2,"message":"'fs' is already declared in the upper scope.","line":326,"column":6,"nodeType":"Identifier","messageId":"noShadow","endLine":326,"endColumn":8},{"ruleId":"no-shadow","severity":2,"message":"'out' is already declared in the upper scope.","line":363,"column":10,"nodeType":"Identifier","messageId":"noShadow","endLine":363,"endColumn":13}],"errorCount":3,"warningCount":2,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\n/**\n * Compile an .att-format finite state transducer (as output by foma)\n * into a compact byte-array representation which is directly executable.\n * The input is expected to be a \"byte machine\", that is, unicode code units\n * have already been decomposed into code points corresponding to UTF-8\n * bytes.  Symbols used in the ATT file:\n *\n *  @0@      Epsilon (\"no character\").  
Used in both input and output edges;\n *           as an input edge this introduces nondeterminism.\n *  <hh>    The input byte with hexadecimal value <hh>\n *             (\"00\" should never appear in the ATT file; see below.)\n *  @_IDENTITY_SYMBOL_@   Any character not named in the (implicit) alphabet\n *  [[       Bracket characters, used to delimit \"unsafe\" strings in\n *  ]]       \"bracket machines\".\n *\n * The output is a byte array.  We use a variable-length integer encoding:\n *   0xxx xxxx -> the directly-encoded value (xxx xxxx)\n *   1xxx xxxx -> (xxx xxxx) + 128 * ( 1 + <the value encoded by subsequent bytes>)\n * For signed quantities, the least significant digit is used for a sign\n * bit.  That is, to encode first:\n *   from_signed(x) = (x >= 0) ? (2*x) : (-2*(x + 1) + 1);\n * and when decoding:\n *   to_signed(x) = (x & 1) ? (((x-1)/-2)-1) : (x/2);\n * See [en:Variable-length_quantity#Zigzag_encoding] for details.\n *\n * Byte value 0x00 is used for \"epsilon\" edges.  Null characters are\n *  disallowed in wikitext, and foma would have trouble handling them\n *  natively since it is written in C with null-terminated strings.\n *  As an input character this represents a non-deterministic transition;\n *  as an output character it represents \"no output\".\n *  If you wanted (for some reason) to allow null characters in the\n *  input (which are not included in the \"anything else\" case), then\n *  encode them as 0xC0 0x80 (aka \"Modified UTF-8\").  [Similarly, if\n *  you wanted to emit a null character, you could emit 0xC0 0x80,\n *  although emitting 0x00 directly ought to work fine as well.]\n *\n * Byte values 0xF8 - 0xFF are disallowed in UTF-8.  We use them for\n * special cases, as follows:\n *  0xFF: EOF (the end of the input string).  Final states in the machine\n *   are represented with an inchar=0xFF outchar=0x00 transition to a\n *   unique \"stop state\" (aka state #0).  
Non-final states have no outgoing\n *   edge for input 0xFF.\n *  0xFE: IDENTITY.  As an output character it copies the input character.\n *  0xFD: ]]\n *  0xFC: [[  These bracketing characters should only appear as output\n *   characters; they will never appear in the input.\n *\n * The byte array begins with eight \"magic bytes\" to help identify the\n * file format.\n *\n * Following this, we have an array of states.  State #0 is the unique\n * \"final state\"; state #1 is the unique start state.  Each state is:\n *   <# of bytes in each edge: variable unsigned int>\n *   <# edges: variable unsigned int>\n *   <edge #0>\n *   <edge #1>\n *   etc\n * Each edge is:\n *   <in byte: 1 byte>\n *   <out byte: 1 byte>\n *   <target state: variable signed int>\n *   <padding, if necessary to reach proper # of bytes in each edge>\n *\n * Edges are sorted by <in byte> to allow binary search. All target\n * states are relative, refer to the start position of that state in\n * the byte array, and are padded to the same size within each state.\n * If the first edge(s) have <in byte> = 0x00 then these edges\n * represent possible epsilon transitions from this state (aka, these\n * edge should be tried if subsequent execution from this state\n * fails).\n */\n\nconst fs = require('fs');\nconst path = require('path');\nconst yargs = require('yargs');\nconst { StringDecoder } = require('string_decoder');\n\nconst FST = require('../lib/language/FST.js');\n\nconst BYTE_IDENTITY = FST.constants.BYTE_IDENTITY;\nconst BYTE_RBRACKET = FST.constants.BYTE_RBRACKET;\nconst BYTE_LBRACKET = FST.constants.BYTE_LBRACKET;\nconst BYTE_FAIL     = FST.constants.BYTE_FAIL;\nconst BYTE_EOF      = FST.constants.BYTE_EOF;\nconst BYTE_EPSILON  = FST.constants.BYTE_EPSILON;\n\nclass DefaultMap extends Map {\n\tconstructor(makeDefaultValue) {\n\t\tsuper();\n\t\tthis.makeDefaultValue = makeDefaultValue;\n\t}\n\tgetDefault(key) {\n\t\tif (!this.has(key)) {\n\t\t\tthis.set(key, 
this.makeDefaultValue());\n\t\t}\n\t\treturn this.get(key);\n\t}\n}\n\n// Splits input on `\\r\\n?|\\n` without holding entire file in memory at once.\nfunction *readLines(inFile) {\n\tconst fd = fs.openSync(inFile, 'r');\n\ttry {\n\t\tconst buf = Buffer.alloc(1024);\n\t\tconst decoder = new StringDecoder('utf8');\n\t\tlet line = '';\n\t\tlet sawCR = false;\n\t\twhile (true) {\n\t\t\tconst bytesRead = fs.readSync(fd, buf, 0, buf.length);\n\t\t\tif (bytesRead === 0) { break; }\n\t\t\tlet lineStart = 0;\n\t\t\tfor (let i = 0; i < bytesRead; i++) {\n\t\t\t\tif (buf[i] === 13 || buf[i] === 10) {\n\t\t\t\t\tline += decoder.write(buf.slice(lineStart, i));\n\t\t\t\t\tif (!(buf[i] === 10 && sawCR)) {\n\t\t\t\t\t\t// skip over the zero-length \"lines\" caused by \\r\\n\n\t\t\t\t\t\tyield line;\n\t\t\t\t\t}\n\t\t\t\t\tline = '';\n\t\t\t\t\tlineStart = i + 1;\n\t\t\t\t\tsawCR = (buf[i] === 13);\n\t\t\t\t} else {\n\t\t\t\t\tsawCR = false;\n\t\t\t\t}\n\t\t\t}\n\t\t\tline += decoder.write(buf.slice(lineStart, bytesRead));\n\t\t}\n\t\tline += decoder.end();\n\t\tyield line;\n\t} finally {\n\t\tfs.closeSync(fd);\n\t}\n}\n\nfunction readAttFile(inFile, handleState, handleFinal) {\n\tlet lastState = 0;\n\tlet edges = [];\n\tconst finalStates = [];\n\tfor (const line of readLines(inFile)) {\n\t\tif (line.length === 0) { continue; }\n\t\tconst fields = line.split(/\\t/g);\n\t\tconst state = +fields[0];\n\t\tif (fields.length === 1 || state !== lastState) {\n\t\t\tif (lastState >= 0) {\n\t\t\t\thandleState(lastState, edges);\n\t\t\t\tedges = [];\n\t\t\t\tlastState = -1;\n\t\t\t}\n\t\t}\n\t\tif (fields.length === 1) {\n\t\t\tfinalStates.push(state);\n\t\t} else {\n\t\t\tconsole.assert(fields.length === 4);\n\t\t\tconst to = +fields[1];\n\t\t\tconst inChar = fields[2];\n\t\t\tconst outChar = fields[3];\n\t\t\tedges.push({ to, inChar, outChar });\n\t\t\tlastState = state;\n\t\t}\n\t}\n\tif (lastState >= 0) {\n\t\thandleState(lastState, edges);\n\t}\n\tif (handleFinal) 
{\n\t\thandleFinal(finalStates);\n\t}\n}\n\nclass DynamicBuffer {\n\tconstructor(chunkLength) {\n\t\tthis.chunkLength = chunkLength || 16384;\n\t\tthis.currBuff = Buffer.alloc(this.chunkLength);\n\t\tthis.buffNum = 0;\n\t\tthis.offset = 0;\n\t\tthis.buffers = [ this.currBuff ];\n\t\tthis.lastLength = 0;\n\t}\n\temit(b) {\n\t\tconsole.assert(b !== undefined);\n\t\tif (this.offset >= this.currBuff.length) {\n\t\t\tthis.buffNum++; this.offset = 0;\n\t\t\tthis._maybeCreateBuffers();\n\t\t\tthis.currBuff = this.buffers[this.buffNum];\n\t\t}\n\t\tthis.currBuff[this.offset++] = b;\n\t\tthis._maybeUpdateLength();\n\t}\n\temitUnsignedV(val, pad) {\n\t\tconst o = [];\n\t\t/* eslint-disable no-bitwise */\n\t\to.push(val & 127);\n\t\tfor (val >>>= 7; val; val >>>= 7) {\n\t\t\to.push(128 | (--val & 127));\n\t\t}\n\t\t/* eslint-enable no-bitwise */\n\t\tfor (let j = o.length - 1; j >= 0; j--) {\n\t\t\tthis.emit(o[j]);\n\t\t}\n\t\tif (pad !== undefined) {\n\t\t\tfor (let j = o.length; j < pad; j++) {\n\t\t\t\tthis.emit(0 /* padding */);\n\t\t\t}\n\t\t}\n\t}\n\temitSignedV(val, pad) {\n\t\tif (val >= 0) {\n\t\t\tval *= 2;\n\t\t} else {\n\t\t\tval = (-val) * 2 - 1;\n\t\t}\n\t\tthis.emitUnsignedV(val, pad);\n\t}\n\tposition() {\n\t\treturn this.offset + this.buffNum * this.chunkLength;\n\t}\n\tlength() {\n\t\treturn this.lastLength + (this.buffers.length - 1) * this.chunkLength;\n\t}\n\ttruncate() {\n\t\tthis.lastLength = this.offset;\n\t\tthis.buffers.length = this.buffNum + 1;\n\t}\n\t_maybeCreateBuffers() {\n\t\twhile (this.buffNum >= this.buffers.length) {\n\t\t\tthis.buffers.push(Buffer.alloc(this.chunkLength));\n\t\t\tthis.lastLength = 0;\n\t\t}\n\t}\n\t_maybeUpdateLength() {\n\t\tif (\n\t\t\tthis.offset > this.lastLength &&\n\t\t\tthis.buffNum === this.buffers.length - 1\n\t\t) {\n\t\t\tthis.lastLength = this.offset;\n\t\t}\n\t}\n\tseek(pos) {\n\t\tconsole.assert(pos !== undefined);\n\t\tthis.buffNum = Math.floor(pos / this.chunkLength);\n\t\tthis.offset = pos - (this.buffNum 
* this.chunkLength);\n\t\tthis._maybeCreateBuffers();\n\t\tthis.currBuff = this.buffers[this.buffNum];\n\t\tthis._maybeUpdateLength();\n\t}\n\tread() {\n\t\tif (this.offset >= this.currBuff.length) {\n\t\t\tthis.buffNum++; this.offset = 0;\n\t\t\tthis._maybeCreateBuffers();\n\t\t\tthis.currBuff = this.buffers[this.buffNum];\n\t\t}\n\t\tconst b = this.currBuff[this.offset++];\n\t\tthis._maybeUpdateLength();\n\t\treturn b;\n\t}\n\treadUnsignedV() {\n\t\tlet b = this.read();\n\t\t/* eslint-disable no-bitwise */\n\t\tlet val = b & 127;\n\t\twhile (b & 128) {\n\t\t\tval += 1;\n\t\t\tb = this.read();\n\t\t\tval = (val << 7) + (b & 127);\n\t\t}\n\t\t/* eslint-enable no-bitwise */\n\t\treturn val;\n\t}\n\treadSignedV() {\n\t\tconst v = this.readUnsignedV();\n\t\t/* eslint-disable no-bitwise */\n\t\tif (v & 1) {\n\t\t\treturn -(v >>> 1) - 1;\n\t\t} else {\n\t\t\treturn (v >>> 1);\n\t\t}\n\t\t/* eslint-enable no-bitwise */\n\t}\n\twriteFile(outFile) {\n\t\tconst fd = fs.openSync(outFile, 'w');\n\t\ttry {\n\t\t\tlet i;\n\t\t\tfor (i = 0; i < this.buffers.length - 1; i++) {\n\t\t\t\tfs.writeSync(fd, this.buffers[i]);\n\t\t\t}\n\t\t\tfs.writeSync(fd, this.buffers[i], 0, this.lastLength);\n\t\t} finally {\n\t\t\tfs.closeSync(fd);\n\t\t}\n\t}\n}\n\nfunction processOne(inFile, outFile, verbose, justBrackets, maxEdgeBytes) {\n\tif (justBrackets === undefined) {\n\t\tjustBrackets = /\\bbrack-/.test(inFile);\n\t}\n\tif (maxEdgeBytes === undefined) {\n\t\tmaxEdgeBytes = 10;\n\t}\n\n\tlet finalStates;\n\tconst alphabet = new Set();\n\tconst sym2byte = function(sym) {\n\t\tif (sym === '@_IDENTITY_SYMBOL_@') { return BYTE_IDENTITY; }\n\t\tif (sym === '@0@') { return BYTE_EPSILON; }\n\t\tif (sym === '[[') { return BYTE_LBRACKET; }\n\t\tif (sym === ']]') { return BYTE_RBRACKET; }\n\t\tif (/^[0-9A-F][0-9A-F]$/i.test(sym)) {\n\t\t\tconst b = Number.parseInt(sym, 16);\n\t\t\tconsole.assert(b !== 0 && b < 0xF8);\n\t\t\treturn b;\n\t\t}\n\t\tconsole.assert(false, `Bad symbol: 
${sym}`);\n\t};\n\t// Quickly read through once in order to pull out the set of final states\n\t// and the alphabet\n\treadAttFile(inFile, (state, edges) => {\n\t\tfor (const e of edges) {\n\t\t\talphabet.add(sym2byte(e.inChar));\n\t\t\talphabet.add(sym2byte(e.outChar));\n\t\t}\n\t}, (fs) => {\n\t\tfinalStates = new Set(fs);\n\t});\n\t// Anything not in `alphabet` is going to be treated as 'anything else'\n\t// but we want to force 0x00 and 0xF8-0xFF to be treated as 'anything else'\n\talphabet.delete(0);\n\tfor (let i = 0xF8; i <= 0xFF; i++) { alphabet.delete(i); }\n\t// Emit a magic number.\n\tconst out = new DynamicBuffer();\n\tout.emit(0x70); out.emit(0x46); out.emit(0x53); out.emit(0x54);\n\tout.emit(0x00); out.emit(0x57); out.emit(0x4D); out.emit(0x00);\n\t// Ok, now read through and build the output array\n\tlet synState = -1;\n\tconst stateMap = new Map();\n\t// Reserve the EOF state (0 in output)\n\tstateMap.set(synState--, out.position());\n\tout.emitUnsignedV(0);\n\tout.emitUnsignedV(0);\n\tconst processState = (state, edges) => {\n\t\tconsole.assert(!stateMap.has(state));\n\t\tstateMap.set(state, out.position());\n\t\tout.emitUnsignedV(maxEdgeBytes);\n\t\t// First emit epsilon edges\n\t\tconst r = edges.filter(e => e.inByte === BYTE_EPSILON);\n\t\t// Then emit a sorted table of inByte transitions, omitting repeated\n\t\t// entries (so it's a range map)\n\t\t// Note that BYTE_EOF is always either FAIL or a transition to a unique\n\t\t// state, so we can always treat values lower than the first entry\n\t\t// or higher than the last entry as FAIL.\n\t\tconst edgeMap = new Map(edges.map(e => [e.inByte, e]));\n\t\tlet lastEdge = { outByte: BYTE_FAIL, to: state };\n\t\tfor (let i = 1; i <= BYTE_EOF; i++) {\n\t\t\tlet e = (alphabet.has(i) || i === BYTE_EOF) ?\n\t\t\t\tedgeMap.get(i) : edgeMap.get(BYTE_IDENTITY);\n\t\t\tif (!e) { e = { outByte: BYTE_FAIL, to: state }; }\n\t\t\t// where possible remap outByte to IDENTITY to maximize chances\n\t\t\t// of adjacent 
states matching\n\t\t\tconst out = (i === e.outByte) ? BYTE_IDENTITY : e.outByte;\n\t\t\tif (out !== lastEdge.outByte || e.to !== lastEdge.to) {\n\t\t\t\tlastEdge = { inByte: i, outByte: out, to: e.to };\n\t\t\t\tr.push(lastEdge);\n\t\t\t}\n\t\t}\n\t\tout.emitUnsignedV(r.length);\n\t\tr.forEach((e) => {\n\t\t\tout.emit(e.inByte);\n\t\t\tout.emit(e.outByte);\n\t\t\tout.emitSignedV(e.to, maxEdgeBytes - 2 /* for inByte/outByte */);\n\t\t});\n\t};\n\treadAttFile(inFile, (state, edges) => {\n\t\t// Map characters to bytes\n\t\tedges = edges.map((e) => {\n\t\t\treturn {\n\t\t\t\tto: e.to,\n\t\t\t\tinByte: sym2byte(e.inChar),\n\t\t\t\toutByte: sym2byte(e.outChar),\n\t\t\t};\n\t\t});\n\t\t// If this is a final state, add a synthetic EOF edge\n\t\tif (finalStates.has(state)) {\n\t\t\tedges.push({ to: -1, inByte: BYTE_EOF, outByte: BYTE_EPSILON });\n\t\t}\n\t\t// Collect edges and figure out if we need to split the state\n\t\t// (if there are multiple edges with the same non-epsilon inByte).\n\t\tconst edgeMap = new DefaultMap(() => []);\n\t\tfor (const e of edges) {\n\t\t\tedgeMap.getDefault(e.inByte).push(e);\n\t\t}\n\t\t// For each inByte with multiple outgoing edges, replace those\n\t\t// edges with a single edge:\n\t\t//  { to: newState, inChar: e.inByte, outChar: BYTE_EPSILON }\n\t\t// ...and then create a new state with edges:\n\t\t//  [{ to: e[n].to, inChar: BYTE_EPSILON, outChar: e[n].outChar},...]\n\t\tconst extraStates = [];\n\t\tfor (const [inByte, e] of edgeMap.entries()) {\n\t\t\tif (inByte !== BYTE_EPSILON && e.length > 1) {\n\t\t\t\tconst nstate = synState--;\n\t\t\t\textraStates.push({\n\t\t\t\t\tstate: nstate,\n\t\t\t\t\tedges: e.map((ee) => {\n\t\t\t\t\t\treturn {\n\t\t\t\t\t\t\tto: ee.to,\n\t\t\t\t\t\t\tinByte: BYTE_EPSILON,\n\t\t\t\t\t\t\toutByte: ee.outByte,\n\t\t\t\t\t\t};\n\t\t\t\t\t}),\n\t\t\t\t});\n\t\t\t\tedgeMap.set(inByte, [{\n\t\t\t\t\tto: nstate,\n\t\t\t\t\tinByte: inByte,\n\t\t\t\t\toutByte: 
BYTE_EPSILON\n\t\t\t\t}]);\n\t\t\t}\n\t\t}\n\t\tprocessState(state, [].concat.apply([], Array.from(edgeMap.values())));\n\t\textraStates.forEach((extra) => {\n\t\t\tprocessState(extra.state, extra.edges);\n\t\t});\n\t});\n\t// Rarely a state will not be mentioned in the .att file except\n\t// in the list of final states; check this & process at the end.\n\tfinalStates.forEach((state) => {\n\t\tif (!stateMap.has(state)) {\n\t\t\tprocessState(state, [\n\t\t\t\t{ to: -1, inByte: BYTE_EOF, outByte: BYTE_EPSILON }\n\t\t\t]);\n\t\t}\n\t});\n\t// Fixup buffer to include relative offsets to states\n\tconst state0pos = stateMap.get(-1);\n\tout.seek(state0pos);\n\twhile (out.position() < out.length()) {\n\t\tconst edgeWidth = out.readUnsignedV();\n\t\tconst nEdges = out.readUnsignedV();\n\t\tconst edge0 = out.position();\n\t\tfor (let i = 0; i < nEdges; i++) {\n\t\t\tconst p = edge0 + i * edgeWidth + /* inByte/outByte: */ 2;\n\t\t\tout.seek(p);\n\t\t\tconst state = out.readSignedV();\n\t\t\tout.seek(p);\n\t\t\tconsole.assert(stateMap.has(state), `${state} not found`);\n\t\t\tout.emitSignedV(stateMap.get(state) - p, edgeWidth - 2);\n\t\t}\n\t\tout.seek(edge0 + nEdges * edgeWidth);\n\t}\n\t// Now iteratively narrow the field widths until the file is as small\n\t// as it can be.\n\twhile (true) {\n\t\tlet trimmed = 0;\n\t\tstateMap.clear();\n\t\tconst widthMap = new Map();\n\t\tout.seek(state0pos);\n\t\twhile (out.position() < out.length()) {\n\t\t\tconst statePos = out.position();\n\t\t\tstateMap.set(statePos, statePos - trimmed);\n\t\t\tconst edgeWidth = out.readUnsignedV();\n\t\t\tconst widthPos = out.position();\n\t\t\tconst nEdges = out.readUnsignedV();\n\t\t\tlet maxWidth = 0;\n\t\t\tconst edge0 = out.position();\n\t\t\tfor (let i = 0; i < nEdges; i++) {\n\t\t\t\tconst p = edge0 + i * edgeWidth;\n\t\t\t\tout.seek(p);\n\t\t\t\tout.read(); out.read(); out.readSignedV();\n\t\t\t\tconst thisWidth = out.position() - p;\n\t\t\t\tmaxWidth = Math.max(maxWidth, 
thisWidth);\n\t\t\t}\n\t\t\twidthMap.set(statePos, maxWidth);\n\t\t\ttrimmed += (edgeWidth - maxWidth) * nEdges;\n\t\t\tif (maxWidth !== edgeWidth) {\n\t\t\t\tout.seek(statePos);\n\t\t\t\tout.emitUnsignedV(maxWidth);\n\t\t\t\ttrimmed += (out.position() - widthPos);\n\t\t\t\tout.seek(statePos);\n\t\t\t\tout.emitUnsignedV(edgeWidth);\n\t\t\t}\n\t\t\tout.seek(edge0 + nEdges * edgeWidth);\n\t\t}\n\t\tstateMap.set(out.position(), out.position() - trimmed);\n\n\t\tif (trimmed === 0) { break; /* nothing left to do */ }\n\t\tif (verbose) { console.log('.'); }\n\n\t\tout.seek(state0pos);\n\t\twhile (out.position() < out.length()) {\n\t\t\tconst statePos = out.position();\n\t\t\tconsole.assert(stateMap.has(statePos) && widthMap.has(statePos));\n\t\t\tconst nWidth = widthMap.get(statePos);\n\n\t\t\tconst oldWidth = out.readUnsignedV();\n\t\t\tconst nEdges = out.readUnsignedV();\n\t\t\tconst edge0 = out.position();\n\n\t\t\tlet nPos = stateMap.get(statePos);\n\t\t\tout.seek(nPos);\n\t\t\tout.emitUnsignedV(nWidth);\n\t\t\tout.emitUnsignedV(nEdges);\n\t\t\tnPos = out.position();\n\n\t\t\tfor (let i = 0; i < nEdges; i++) {\n\t\t\t\tout.seek(edge0 + i * oldWidth);\n\t\t\t\tconst inByte = out.read();\n\t\t\t\tconst outByte = out.read();\n\t\t\t\tlet toPos = out.position();\n\t\t\t\ttoPos += out.readSignedV();\n\t\t\t\tconsole.assert(stateMap.has(toPos), toPos);\n\t\t\t\ttoPos = stateMap.get(toPos);\n\n\t\t\t\tout.seek(nPos);\n\t\t\t\tout.emit(inByte);\n\t\t\t\tout.emit(outByte);\n\t\t\t\ttoPos -= out.position();\n\t\t\t\tout.emitSignedV(toPos, nWidth - 2);\n\t\t\t\tnPos = out.position();\n\t\t\t}\n\t\t\tout.seek(edge0 + nEdges * oldWidth);\n\t\t}\n\t\tout.seek(stateMap.get(out.position()));\n\t\tout.truncate();\n\t}\n\n\t// Done!\n\tout.writeFile(outFile);\n}\n\nfunction main() {\n\tconst yopts = yargs\n\t.usage(\n\t\t'Usage: $0 [options] <conversion> <inverse>\\n' +\n\t\t'Converts a finite-state transducer in .att format.'\n\t)\n\t.options({\n\t\t'output': {\n\t\t\tdescription: 
'Output filename (or base name)',\n\t\t\talias: 'o',\n\t\t\tnargs: 1,\n\t\t\tnormalize: true,\n\t\t},\n\t\t'file': {\n\t\t\tdescription: 'Input .att filename',\n\t\t\talias: 'f',\n\t\t\tconflicts: 'language',\n\t\t\timplies: 'output',\n\t\t\tnargs: 1,\n\t\t\tnormalize: true,\n\t\t},\n\t\t'language': {\n\t\t\tdescription: 'Converts trans-{conversion}, brack-{conversion}-noop, and brack-{conversion}-{inverse} in default locations',\n\t\t\talias: 'l',\n\t\t\tconflicts: 'file',\n\t\t\tarray: true,\n\t\t},\n\t\t'brackets': {\n\t\t\tdescription: 'Emit a bracket-location machine',\n\t\t\talias: 'b',\n\t\t\tboolean: true,\n\t\t\tdefault: undefined,\n\t\t},\n\t\t'verbose': {\n\t\t\tdescription: 'Show progress',\n\t\t\talias: 'v',\n\t\t\tboolean: true,\n\t\t},\n\t})\n\t.example('$0 -l sr-ec sr-el');\n\n\tconst argv = yopts.argv;\n\tif (argv.help) {\n\t\tyopts.showHelp();\n\t\treturn;\n\t}\n\n\tif (argv.file) {\n\t\tprocessOne(argv.file, argv.output, argv.brackets);\n\t} else if (argv.language) {\n\t\tconst convertLang = argv.language[0];\n\t\tconst inverseLangs = argv.language.slice(1);\n\t\tconst baseDir = path.join(__dirname, '..', 'lib', 'language', 'fst');\n\t\tfor (const f of [\n\t\t\t`trans-${convertLang}`,\n\t\t\t`brack-${convertLang}-noop`,\n\t\t].concat(inverseLangs.map(inv => `brack-${convertLang}-${inv}`))) {\n\t\t\tif (argv.verbose) {\n\t\t\t\tconsole.log(f);\n\t\t\t}\n\t\t\tprocessOne(\n\t\t\t\tpath.join(baseDir, `${f}.att`),\n\t\t\t\tpath.join(baseDir, `${f}.pfst`),\n\t\t\t\targv.verbose\n\t\t\t);\n\t\t}\n\t} else {\n\t\tyopts.showHelp();\n\t}\n}\n\nif (require.main === module) {\n\tmain();\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/compare.linter.results.js","messages":[{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error 
instead.","line":101,"column":2,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":101,"endColumn":17},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":108,"column":2,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":108,"endColumn":17},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":114,"column":2,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":114,"endColumn":17}],"errorCount":3,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\n/* Fetch new results from https://tools.wmflabs.org/wikitext-deprecation/api,\n * save them locally, then run this script to compare how things have changed\n * for a particular category */\n\n// ------------------ Console printer ------------------\nfunction pad(n, len) {\n\tif (!len) {\n\t\tlen = 15;\n\t}\n\treturn String(n).padStart(len);\n}\n\nfunction ConsolePrinter() {}\n\nConsolePrinter.printSectionHeader = function(heading) {\n\tconsole.log(heading);\n};\n\nConsolePrinter.printTableHeader = function(columns) {\n\tconsole.log(\"-\".repeat(80));\n\tconsole.log(columns.map(function(c) { return pad(c); }).join('\\t'));\n\tconsole.log(\"-\".repeat(80));\n};\n\nConsolePrinter.printTableRow = function(columns) {\n\tconsole.log(columns.map(function(c) { return pad(c); }).join('\\t'));\n};\n\nConsolePrinter.printTableFooter = function() {\n\tconsole.log(\"-\".repeat(80));\n\tconsole.log(\"\\n\");\n};\n\n// ------------------ Wikitext printer ------------------\nfunction WikitextPrinter() {}\n\nWikitextPrinter.printSectionHeader = function(heading) {\n\tconsole.log('==' + heading + '==');\n};\n\nWikitextPrinter.printTableHeader = function(columns) {\n\tconsole.log('{| class=\"wikitable sortable\" style=\"width:60%\"');\n\tconsole.log('|-');\n\tconsole.log('!' 
+ columns.join('!!'));\n};\n\nWikitextPrinter.printTableRow = function(columns) {\n\tconsole.log('|-');\n\tconsole.log('|' + columns.join('||'));\n};\n\nWikitextPrinter.printTableFooter = function() {\n\tconsole.log('|}\\n');\n};\n\n// ------------------------------------------------------\nrequire('../core-upgrade.js');\nvar path = require('path');\nvar yargs = require('yargs');\n\nvar opts = yargs\n.usage(\"Usage: $0 [options] old-json-file new-json-file\")\n.options({\n\thelp: {\n\t\tdescription: 'Show this message',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\twikify: {\n\t\tdescription: 'Emit report in wikitext format for a wiki',\n\t\t'boolean': true,\n\t\t'default': false,\n\t},\n\tbaseline_count: {\n\t\tdescription: 'Baseline count for determining remex-readiness',\n\t\t'boolean': false,\n\t\t'default': 25,\n\t},\n});\n\nvar highPriorityCats = [\n\t\"deletable-table-tag\",\n\t\"pwrap-bug-workaround\",\n\t\"self-closed-tag\",\n\t\"tidy-whitespace-bug\",\n\t\"html5-misnesting\",\n\t\"tidy-font-bug\",\n\t\"multiline-html-table-in-list\",\n\t\"multiple-unclosed-formatting-tags\",\n\t\"unclosed-quotes-in-heading\",\n];\n\nvar argv = opts.argv;\nvar numArgs = argv._.length;\nif (numArgs < 2) {\n\topts.showHelp();\n\tprocess.exit(1);\n}\n\nvar oldResults = require(path.resolve(process.cwd(), argv._[0]));\nvar wikis = Object.keys(oldResults);\nif (wikis.length === 0) {\n\tconsole.log(\"Old results from \" + argv._[0] + \" seem empty?\");\n\tprocess.exit(1);\n}\n\nvar newResults = require(path.resolve(process.cwd(), argv._[1]));\nif (Object.keys(newResults).length === 0) {\n\tconsole.log(\"New results from \" + argv._[1] + \" seem empty?\");\n\tprocess.exit(1);\n}\n\nvar printer = argv.wikify ? 
WikitextPrinter : ConsolePrinter;\n\nfunction printStatsForCategory(cat, p) {\n\tvar changes = wikis.reduce(function(accum, w) {\n\t\t// Skip wikis that don't have results for both wikis\n\t\tif (!newResults[w]) {\n\t\t\treturn accum;\n\t\t}\n\t\t// Record changes\n\t\tvar o = oldResults[w].linter_info[cat];\n\t\tvar n = newResults[w].linter_info[cat];\n\t\tif (n !== o) {\n\t\t\taccum.push({\n\t\t\t\twiki: w,\n\t\t\t\told: o,\n\t\t\t\tnew: n,\n\t\t\t\tchange: n - o,\n\t\t\t\tpercentage: o > 0 ? Math.round((n - o) / o * 1000) / 10 : 0,\n\t\t\t});\n\t\t}\n\t\treturn accum;\n\t}, []);\n\n\t// Most improved wikis first\n\tchanges.sort(function(a, b) {\n\t\treturn a.change > b.change ? 1 : (a.change < b.change ? -1 : 0);\n\t});\n\n\tp.printSectionHeader(\"Changes in \" + cat + \" counts for wikis\");\n\tp.printTableHeader([\"WIKI\", \"OLD\", \"NEW\", \"CHANGE\", \"PERCENTAGE\"]);\n\tfor (var i = 0; i < changes.length; i++) {\n\t\tvar d = changes[i];\n\t\tp.printTableRow([d.wiki, d.old, d.new, d.change, d.percentage]);\n\t}\n\tp.printTableFooter();\n}\n\n// Dump stats for each high-priority category\nhighPriorityCats.forEach(function(cat) {\n\tprintStatsForCategory(cat, printer);\n});\n\n// If count is below this threshold for all high priority categories,\n// we deem those wikis remex-ready. 
The threshold comes from the\n// --baseline_count CLI option (default 25).\nvar maxCountPerHighPriorityCategory = parseInt(argv.baseline_count, 10);\nvar remexReadyWikis = [];\nwikis.forEach(function(w) {\n\tif (!newResults[w]) {\n\t\treturn;\n\t}\n\n\t// Check if this wiki is remex-ready\n\tvar remexReady = highPriorityCats.every(function(c) {\n\t\treturn newResults[w].linter_info[c] <= maxCountPerHighPriorityCategory;\n\t});\n\tif (remexReady) {\n\t\tremexReadyWikis.push({\n\t\t\tname: w,\n\t\t\tchanged: highPriorityCats.some(function(c) {\n\t\t\t\treturn oldResults[w].linter_info[c] > maxCountPerHighPriorityCategory;\n\t\t\t}),\n\t\t});\n\t}\n});\n\nif (remexReadyWikis.length > 0) {\n\tconsole.log('\\n');\n\tprinter.printSectionHeader('Wikis with <= ' + argv.baseline_count + ' errors in all high priority categories');\n\tprinter.printTableHeader(['New', 'Changed?']);\n\tfor (var i = 0; i < remexReadyWikis.length; i++) {\n\t\tprinter.printTableRow([remexReadyWikis[i].name, remexReadyWikis[i].changed]);\n\t}\n\tprinter.printTableFooter();\n}\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/fetch-parserTests.txt.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/fetch-revision-data.js","messages":[{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/mw/ApiRequest.js\" is not found.","line":20,"column":31,"nodeType":"Literal","endLine":20,"endColumn":56},{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/config/MWParserEnvironment.js\" is not 
found.","line":22,"column":35,"nodeType":"Literal","endLine":22,"endColumn":73}],"errorCount":2,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\n/*\n * Given a title (and an optional revision id), fetch:\n * 1. the wikitext for a page from the MW API\n * 2. latest matching HTML and data-parsoid for the revision from RESTBase\n */\n\nvar fs = require('pn/fs');\nvar path = require('path');\nvar yargs = require('yargs');\nvar yaml = require('js-yaml');\n\nvar Promise = require('../lib/utils/promise.js');\n\nvar TemplateRequest = require('../lib/mw/ApiRequest.js').TemplateRequest;\nvar ParsoidConfig = require('../lib/config/ParsoidConfig.js').ParsoidConfig;\nvar MWParserEnvironment = require('../lib/config/MWParserEnvironment.js').MWParserEnvironment;\nvar Util = require('../lib/utils/Util.js').Util;\nvar ScriptUtils = require('./ScriptUtils.js').ScriptUtils;\n\nvar fetch = Promise.async(function *(page, revid, opts) {\n\tvar prefix = opts.prefix || null;\n\tvar domain = opts.domain || null;\n\tif (!prefix && !domain) {\n\t\tdomain = \"en.wikipedia.org\";\n\t}\n\n\tvar parsoidOptions = {};\n\n\tif (ScriptUtils.booleanOption(opts.config)) {\n\t\tvar p = (typeof (opts.config) === 'string') ?\n\t\t\tpath.resolve('.', opts.config) :\n\t\t\tpath.resolve(__dirname, '../config.yaml');\n\t\t// Assuming Parsoid is the first service in the list\n\t\tparsoidOptions = yaml.load(yield fs.readFile(p, 'utf8')).services[0].conf;\n\t}\n\n\tScriptUtils.setTemplatingAndProcessingFlags(parsoidOptions, opts);\n\tScriptUtils.setDebuggingFlags(parsoidOptions, opts);\n\n\tif (parsoidOptions.localsettings) {\n\t\tparsoidOptions.localsettings = path.resolve(__dirname, parsoidOptions.localsettings);\n\t}\n\n\tvar pc = new ParsoidConfig(null, parsoidOptions);\n\tif (!prefix) {\n\t\t// domain has been provided\n\t\tprefix = pc.getPrefixFor(domain);\n\t} else if (!domain) {\n\t\t// prefix has been 
set\n\t\tdomain = pc.mwApiMap.get(prefix).domain;\n\t}\n\tpc.defaultWiki = prefix;\n\n\tvar outputPrefix = prefix + \".\" + Util.phpURLEncode(page);\n\tvar rbOpts = {\n\t\turi: null,\n\t\tmethod: 'GET',\n\t\theaders: {\n\t\t\t'User-Agent': pc.userAgent,\n\t\t},\n\t};\n\n\tvar env = yield MWParserEnvironment.getParserEnv(pc, {\n\t\tprefix: prefix,\n\t\tdomain: domain,\n\t\tpageName: page,\n\t});\n\n\t// Fetch wikitext from mediawiki API\n\tvar target = page ?\n\t\tenv.normalizeAndResolvePageTitle() : null;\n\tyield TemplateRequest.setPageSrcInfo(env, target, revid);\n\tyield fs.writeFile(outputPrefix + \".wt\", env.page.src, 'utf8');\n\n\t// Fetch HTML from RESTBase\n\trbOpts.uri = \"https://\" + domain + \"/api/rest_v1/page/html/\" + Util.phpURLEncode(page) + (revid ? \"/\" + revid : \"\");\n\tvar resp = yield ScriptUtils.retryingHTTPRequest(2, rbOpts);\n\tyield fs.writeFile(outputPrefix + \".html\", resp[1], 'utf8');\n\tvar etag = resp[0].headers.etag.replace(/^W\\//, '').replace(/\"/g, '');\n\n\t// Fetch matching data-parsoid from RESTBase\n\trbOpts.uri = \"https://\" + domain + \"/api/rest_v1/page/data-parsoid/\" + Util.phpURLEncode(page) + \"/\" + etag;\n\tresp = yield ScriptUtils.retryingHTTPRequest(2, rbOpts);\n\n\t// RESTBase doesn't have the outer wrapper\n\t// that the parse.js script expects\n\tvar pb = '{\"parsoid\":' + resp[1] + \"}\";\n\tyield fs.writeFile(outputPrefix + \".pb.json\", pb, 'utf8');\n\n\tconsole.log(\"If you are debugging a bug report on a VE edit, make the desired edit to the HTML file and save it to a new file.\");\n\tconsole.log(\"Then run the following script to generate the edited wikitext\");\n\tconsole.log(\"parse.js --html2wt --selser --oldtextfile \"\n\t\t+ outputPrefix + \".wt\"\n\t\t+ \" --oldhtmlfile \" + outputPrefix + \".html\"\n\t\t+ \" --pbinfile \" + outputPrefix + \".pb.json\"\n\t\t+ \" < edited.html > edited.wt\");\n});\n\nvar usage = 'Usage: $0 [options] --title <page-title> [--revid <rev-id>]\\n';\nvar yopts = 
yargs\n.usage(usage)\n.options({\n\t'config': {\n\t\tdescription: \"Path to a config.yaml file. Defaults to the server's config.yaml\",\n\t\t'default': true,\n\t},\n\t'prefix': {\n\t\tdescription: 'Which wiki prefix to use; e.g. \"enwiki\" for English wikipedia, \"eswiki\" for Spanish, \"mediawikiwiki\" for mediawiki.org',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\t'domain': {\n\t\tdescription: 'Which wiki to use; e.g. \"en.wikipedia.org\" for English wikipedia, \"es.wikipedia.org\" for Spanish, \"www.mediawiki.org\" for mediawiki.org',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\t'revid': {\n\t\tdescription: 'Page revision to fetch',\n\t\t'boolean': false,\n\t},\n\t'title': {\n\t\tdescription: 'Page title to fetch',\n\t\t'boolean': false,\n\t},\n});\n\nPromise.async(function *() {\n\tvar argv = yopts.argv;\n\tvar title = argv.title;\n\tvar error;\n\tif (!title) {\n\t\terror = \"Must specify a title.\";\n\t}\n\n\tif (argv.help || error) {\n\t\tif (error) {\n\t\t\t// Make the error standout in the output\n\t\t\tvar buf = [\"-------\"];\n\t\t\tfor (var i = 0; i < error.length; i++) {\n\t\t\t\tbuf.push(\"-\");\n\t\t\t}\n\t\t\tbuf = buf.join('');\n\t\t\tconsole.error(buf);\n\t\t\tconsole.error('ERROR:', error);\n\t\t\tconsole.error(buf);\n\t\t}\n\t\tyopts.showHelp();\n\t\treturn;\n\t}\n\n\tyield fetch(title, argv.revid, argv);\n})().done();\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/fetch-wmf-sitematrix.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/fetch-wt.js","messages":[{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/mw/ApiRequest.js\" 
is not found.","line":20,"column":31,"nodeType":"Literal","endLine":20,"endColumn":56},{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/config/MWParserEnvironment.js\" is not found.","line":22,"column":35,"nodeType":"Literal","endLine":22,"endColumn":73}],"errorCount":2,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\n/** Fetch the wikitext for a page, given title or revision id.\n *\n *  This is very useful for extracting test cases which can then be passed\n *  to tests/parse.js\n */\n\nvar fs = require('pn/fs');\nvar path = require('path');\nvar yargs = require('yargs');\nvar yaml = require('js-yaml');\n\nvar Promise = require('../lib/utils/promise.js');\n\nvar TemplateRequest = require('../lib/mw/ApiRequest.js').TemplateRequest;\nvar ParsoidConfig = require('../lib/config/ParsoidConfig.js').ParsoidConfig;\nvar MWParserEnvironment = require('../lib/config/MWParserEnvironment.js').MWParserEnvironment;\nvar ScriptUtils = require('./ScriptUtils.js').ScriptUtils;\n\nvar fetch = Promise.async(function *(page, revid, opts) {\n\tvar prefix = opts.prefix || null;\n\tvar domain = opts.domain || null;\n\n\tif (opts.apiURL) {\n\t\tprefix = 'customwiki';\n\t\tdomain = null;\n\t} else if (!(prefix || domain)) {\n\t\tdomain = 'en.wikipedia.org';\n\t}\n\n\tvar parsoidOptions = {\n\t\tloadWMF: opts.loadWMF,\n\t};\n\n\tif (ScriptUtils.booleanOption(opts.config)) {\n\t\tvar p = (typeof (opts.config) === 'string') ?\n\t\t\tpath.resolve('.', opts.config) :\n\t\t\tpath.resolve(__dirname, '../config.yaml');\n\t\t// Assuming Parsoid is the first service in the list\n\t\tparsoidOptions = yaml.load(yield fs.readFile(p, 'utf8')).services[0].conf;\n\t}\n\n\tScriptUtils.setTemplatingAndProcessingFlags(parsoidOptions, opts);\n\tScriptUtils.setDebuggingFlags(parsoidOptions, opts);\n\n\tif (parsoidOptions.localsettings) {\n\t\tparsoidOptions.localsettings = 
path.resolve(__dirname, parsoidOptions.localsettings);\n\t}\n\n\tvar pc = new ParsoidConfig(null, parsoidOptions);\n\tpc.defaultWiki = prefix || pc.getPrefixFor(domain);\n\n\tvar env = yield MWParserEnvironment.getParserEnv(pc, {\n\t\tprefix: prefix,\n\t\tdomain: domain,\n\t\tpageName: page,\n\t});\n\tvar target = page ?\n\t\tenv.normalizeAndResolvePageTitle() : null;\n\tyield TemplateRequest.setPageSrcInfo(env, target, revid);\n\n\tif (opts.output) {\n\t\tyield fs.writeFile(opts.output, env.page.src, 'utf8');\n\t} else {\n\t\tconsole.log(env.page.src);\n\t}\n});\n\nvar usage = 'Usage: $0 [options] <page-title or rev-id>\\n' +\n\t'If first argument is numeric, it is used as a rev id; otherwise it is\\n' +\n\t'used as a title.  Use the --title option for a numeric title.';\n\nvar yopts = yargs\n.usage(usage)\n.options(ScriptUtils.addStandardOptions({\n\t'output': {\n\t\tdescription: \"Write page to given file\",\n\t},\n\t'config': {\n\t\tdescription: \"Path to a config.yaml file.  Use --config w/ no argument to default to the server's config.yaml\",\n\t\t'default': false,\n\t},\n\t'prefix': {\n\t\tdescription: 'Which wiki prefix to use; e.g. \"enwiki\" for English wikipedia, \"eswiki\" for Spanish, \"mediawikiwiki\" for mediawiki.org',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\t'domain': {\n\t\tdescription: 'Which wiki to use; e.g. 
\"en.wikipedia.org\" for English wikipedia, \"es.wikipedia.org\" for Spanish, \"www.mediawiki.org\" for mediawiki.org',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\t'revid': {\n\t\tdescription: 'Page revision to fetch',\n\t\t'boolean': false,\n\t},\n\t'title': {\n\t\tdescription: 'Page title to fetch (only if revid is not present)',\n\t\t'boolean': false,\n\t},\n\t'loadWMF': {\n\t\tdescription: 'Use WMF mediawiki API config',\n\t\t'boolean': true,\n\t\t'default': true,\n\t},\n}));\n\nPromise.async(function *() {\n\tvar argv = yopts.argv;\n\tvar title = null;\n\tvar revid = null;\n\tvar error;\n\tif (argv.title && argv.revid) {\n\t\terror = \"Can't specify title and revid at the same time.\";\n\t} else if (argv.title) {\n\t\ttitle = '' + argv.title; // convert, in case it's numeric.\n\t} else if (argv.revid) {\n\t\trevid = +argv.revid;\n\t} else if (typeof (argv._[0]) === 'number') {\n\t\trevid = argv._[0];\n\t} else if (argv._[0]) {\n\t\ttitle = argv._[0];\n\t} else {\n\t\terror = \"Must specify a title or revision id.\";\n\t}\n\n\tif (argv.help || error) {\n\t\tif (error) {\n\t\t\t// Make the error standout in the output\n\t\t\tvar buf = [\"-------\"];\n\t\t\tfor (var i = 0; i < error.length; i++) {\n\t\t\t\tbuf.push(\"-\");\n\t\t\t}\n\t\t\tbuf = buf.join('');\n\t\t\tconsole.error(buf);\n\t\t\tconsole.error('ERROR:', error);\n\t\t\tconsole.error(buf);\n\t\t}\n\t\tyopts.showHelp();\n\t\treturn;\n\t}\n\n\tyield fetch(title, revid, 
argv);\n})().done();\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/fetch_ve_nowiki_edits.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/regression-testing.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/runRtTests.js","messages":[],"errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/sync-baseconfig.js","messages":[{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/mw/ApiRequest.js\" is not found.","line":17,"column":29,"nodeType":"Literal","endLine":17,"endColumn":54},{"ruleId":"node/no-missing-require","severity":2,"message":"\"../lib/config/MWParserEnvironment.js\" is not found.","line":18,"column":35,"nodeType":"Literal","endLine":18,"endColumn":73}],"errorCount":2,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"source":"#!/usr/bin/env node\n\n'use strict';\n\nrequire('../core-upgrade.js');\n\n/**\n * Fetch the siteconfig for a set of wikis.\n * See: lib/config/baseconfig/README\n */\n\nvar fs = require('pn/fs');\nvar path = require('path');\nvar yargs = require('yargs');\nvar yaml = require('js-yaml');\n\nvar ConfigRequest = require('../lib/mw/ApiRequest.js').ConfigRequest;\nvar 
MWParserEnvironment = require('../lib/config/MWParserEnvironment.js').MWParserEnvironment;\nvar ParsoidConfig = require('../lib/config/ParsoidConfig.js').ParsoidConfig;\nvar Promise = require('../lib/utils/promise.js');\nvar ScriptUtils = require('./ScriptUtils.js').ScriptUtils;\n\nvar update = Promise.async(function *(opts) {\n\tvar prefix = opts.prefix || null;\n\tvar domain = opts.domain || null;\n\n\tif (opts.apiURL) {\n\t\tprefix = 'customwiki';\n\t\tdomain = null;\n\t} else if (!(prefix || domain)) {\n\t\tdomain = 'en.wikipedia.org';\n\t}\n\n\tvar parsoidOptions = {\n\t\tloadWMF: true,\n\t\tfetchConfig: true,\n\t};\n\n\tif (ScriptUtils.booleanOption(opts.config)) {\n\t\tvar p = (typeof (opts.config) === 'string') ?\n\t\t\tpath.resolve('.', opts.config) :\n\t\t\tpath.resolve(__dirname, '../config.yaml');\n\t\t// Assuming Parsoid is the first service in the list\n\t\tparsoidOptions = yaml.load(yield fs.readFile(p, 'utf8')).services[0].conf;\n\t}\n\n\tScriptUtils.setTemplatingAndProcessingFlags(parsoidOptions, opts);\n\tScriptUtils.setDebuggingFlags(parsoidOptions, opts);\n\n\tif (parsoidOptions.localsettings) {\n\t\tparsoidOptions.localsettings = path.resolve(__dirname, parsoidOptions.localsettings);\n\t}\n\n\tvar pc = new ParsoidConfig(null, parsoidOptions);\n\tpc.defaultWiki = prefix || pc.getPrefixFor(domain);\n\n\tvar env = yield MWParserEnvironment.getParserEnv(pc, {\n\t\tprefix: prefix,\n\t\tdomain: domain,\n\t});\n\tvar resultConf = yield ConfigRequest.promise(env, opts.formatversion);\n\tvar configDir = path.resolve(__dirname, '..');\n\tvar iwp = env.conf.wiki.iwp;\n\t// HACK for be-tarask\n\tif (iwp === 'be_x_oldwiki') { iwp = 'be-taraskwiki'; }\n\tvar localConfigFile = path.resolve(\n\t\tconfigDir, `./baseconfig/${opts.formatversion === 2 ? 
'2/' : ''}${iwp}.json`\n\t);\n\tvar resultStr = JSON.stringify({ query: resultConf }, null, 2);\n\tyield fs.writeFile(localConfigFile, resultStr, 'utf8');\n\tconsole.log('Wrote', localConfigFile);\n});\n\nvar usage = 'Usage: $0 [options]\\n' +\n\t'Rewrites one cached siteinfo configuration.\\n' +\n\t'Use --domain or --prefix to select which one to rewrite.';\n\nvar yopts = yargs\n.usage(usage)\n.options(ScriptUtils.addStandardOptions({\n\t'config': {\n\t\tdescription: \"Path to a config.yaml file.  Use --config w/ no argument to default to the server's config.yaml\",\n\t\t'default': false,\n\t},\n\t'prefix': {\n\t\tdescription: 'Which wiki prefix to use; e.g. \"enwiki\" for English wikipedia, \"eswiki\" for Spanish, \"mediawikiwiki\" for mediawiki.org',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\t'domain': {\n\t\tdescription: 'Which wiki to use; e.g. \"en.wikipedia.org\" for English wikipedia, \"es.wikipedia.org\" for Spanish, \"www.mediawiki.org\" for mediawiki.org',\n\t\t'boolean': false,\n\t\t'default': null,\n\t},\n\t'formatversion': {\n\t\tdescription: 'Which formatversion to use',\n\t\t'boolean': false,\n\t\t'default': 1,\n\t}\n}));\n\nPromise.async(function *() {\n\tvar argv = yopts.argv;\n\tif (argv.help) {\n\t\tyopts.showHelp();\n\t\treturn;\n\t}\n\tyield update(argv);\n})().done();\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]},{"filePath":"/src/repo/tools/sync-parserTests.js","messages":[{"ruleId":"jsdoc/check-alignment","severity":1,"message":"Expected JSDoc block to be aligned.","line":8,"column":null,"nodeType":"Block","endLine":8,"endColumn":null,"fix":{"range":[68,2526],"text":"/**\n   == USAGE ==\n \n   Script to synchronize parsoid parserTests with parserTests in other repos.\n \n   Basic use:\n     $PARSOID is the path to a checked out git copy of Parsoid\n     $REPO is the path to a checked out git copy of the repo 
containing\n       the parserTest file. (Check the `repo` key in tests/parserTests.json)\n     $BRANCH is a branch name for the patch to $REPO (ie, 'ptsync-<date>')\n     $TARGET identifies which set of parserTests we're synchronizing.\n       (This should be one of the top-level keys in tests/parserTests.json)\n \n   $ cd $PARSOID\n   $ tools/sync-parserTests.js $REPO $BRANCH $TARGET\n   $ cd $REPO\n   $ git rebase master\n     ... resolve conflicts, sigh ...\n   $ php tests/parser/parserTests.php\n     ... fix any failures by marking tests parsoid-only, etc ...\n   $ git review\n \n     ... time passes, eventually your patch is merged to $REPO ...\n \n   $ cd $PARSOID\n   $ tools/fetch-parserTests.txt.js $TARGET --force\n   $ php bin/parserTests.php --updateKnownFailures\n   $ git add -u\n   $ git commit -m \"Sync parserTests with core\"\n   $ git review\n \n   Simple, right?\n \n   == WHY ==\n \n   There are two copies of parserTests files.\n \n   Since Parsoid & core are in different repositories and both Parsoid\n   and the legacy parser are still operational, we need a parserTests\n   file in each repository. They are usually in sync but since folks\n   are hacking both wikitext engines simultaneously, the two copies\n   might be modified independently. So, we need to periodically sync\n   them (which is just a multi-repo rebase).\n \n   We detect incompatible divergence of the two copies via CI. We run the\n   legacy parser against Parsoid's copy of the test file and test failures\n   indicate a divergence and necessitates a sync. 
We don't yet have CI\n   code that runs Parsoid against core's copy of the test file, but that\n   might come soon.\n \n   This discussion only touched upon tests/parser/parserTests.txt but\n   all of the same considerations apply to the parser test file for\n   extensions since we have a Parsoid-version and a legacy-parser version\n   of many extensions at this time.\n \n   == THINKING ==\n \n   The \"thinking\" part of the sync is to look at the patches created and\n   make sure that whatever change was made upstream (as shown in the diff\n   of the sync patch) doesn't require a corresponding change in Parsoid\n   and file a phab task and regenerate the known-differences list if that\n   happens to be the case.\n */"}},{"ruleId":"no-process-exit","severity":2,"message":"Don't use process.exit(); throw an error instead.","line":189,"column":2,"nodeType":"CallExpression","messageId":"noProcessExit","endLine":189,"endColumn":17}],"errorCount":1,"warningCount":1,"fixableErrorCount":0,"fixableWarningCount":1,"source":"#!/usr/bin/env node\n\n\"use strict\";\n\nrequire('../core-upgrade.js');\n\n/**\n   == USAGE ==\n\n   Script to synchronize parsoid parserTests with parserTests in other repos.\n\n   Basic use:\n     $PARSOID is the path to a checked out git copy of Parsoid\n     $REPO is the path to a checked out git copy of the repo containing\n       the parserTest file. (Check the `repo` key in tests/parserTests.json)\n     $BRANCH is a branch name for the patch to $REPO (ie, 'ptsync-<date>')\n     $TARGET identifies which set of parserTests we're synchronizing.\n       (This should be one of the top-level keys in tests/parserTests.json)\n\n   $ cd $PARSOID\n   $ tools/sync-parserTests.js $REPO $BRANCH $TARGET\n   $ cd $REPO\n   $ git rebase master\n     ... resolve conflicts, sigh ...\n   $ php tests/parser/parserTests.php\n     ... fix any failures by marking tests parsoid-only, etc ...\n   $ git review\n\n     ... 
time passes, eventually your patch is merged to $REPO ...\n\n   $ cd $PARSOID\n   $ tools/fetch-parserTests.txt.js $TARGET --force\n   $ php bin/parserTests.php --updateKnownFailures\n   $ git add -u\n   $ git commit -m \"Sync parserTests with core\"\n   $ git review\n\n   Simple, right?\n\n   == WHY ==\n\n   There are two copies of parserTests files.\n\n   Since Parsoid & core are in different repositories and both Parsoid\n   and the legacy parser are still operational, we need a parserTests\n   file in each repository. They are usually in sync but since folks\n   are hacking both wikitext engines simultaneously, the two copies\n   might be modified independently. So, we need to periodically sync\n   them (which is just a multi-repo rebase).\n\n   We detect incompatible divergence of the two copies via CI. We run the\n   legacy parser against Parsoid's copy of the test file and test failures\n   indicate a divergence and necessitates a sync. We don't yet have CI\n   code that runs Parsoid against core's copy of the test file, but that\n   might come soon.\n\n   This discussion only touched upon tests/parser/parserTests.txt but\n   all of the same considerations apply to the parser test file for\n   extensions since we have a Parsoid-version and a legacy-parser version\n   of many extensions at this time.\n\n   == THINKING ==\n\n   The \"thinking\" part of the sync is to look at the patches created and\n   make sure that whatever change was made upstream (as shown in the diff\n   of the sync patch) doesn't require a corresponding change in Parsoid\n   and file a phab task and regenerate the known-differences list if that\n   happens to be the case.\n */\n\nvar yargs = require('yargs');\nvar childProcess = require('pn/child_process');\nvar path = require('path');\nvar fs = require('pn/fs');\n\nvar Promise = require('../lib/utils/promise.js');\n\nvar testDir = path.join(__dirname, '../tests/');\nvar testFilesPath = path.join(testDir, 'parserTests.json');\nvar 
testFiles = require(testFilesPath);\n\nvar DEFAULT_TARGET = 'parserTests.txt';\n\nvar strip = function(s) {\n\treturn s.replace(/(^\\s+)|(\\s+$)/g, '');\n};\n\nPromise.async(function *() {\n\t// Option parsing and helpful messages.\n\tvar usage = 'Usage: $0 <repo path> <branch name> <target>';\n\tvar opts = yargs\n\t.usage(usage)\n\t.options({\n\t\t'help': { description: 'Show this message' },\n\t});\n\tvar argv = opts.argv;\n\tif (argv.help || argv._.length < 2 || argv._.length > 3) {\n\t\topts.showHelp();\n\t\tvar morehelp = yield fs.readFile(__filename, 'utf8');\n\t\tmorehelp = strip(morehelp.split(/== [A-Z]* ==/, 2)[1]);\n\t\tconsole.log(morehelp.replace(/^ {3}/mg, ''));\n\t\treturn;\n\t}\n\n\t// Ok, let's do this thing!\n\tvar mwpath = path.resolve(argv._[0]);\n\tvar branch = argv._[1];\n\tvar targetName = argv._[2] || DEFAULT_TARGET;\n\n\tif (!testFiles.hasOwnProperty(targetName)) {\n\t\tconsole.warn(targetName + ' not defined in parserTests.json');\n\t\treturn;\n\t}\n\n\tvar file = testFiles[targetName];\n\tvar oldhash = file.latestCommit;\n\n\tvar mwexec = function(cmd) {\n\t\t// Execute `cmd` in the mwpath directory.\n\t\treturn new Promise(function(resolve, reject) {\n\t\t\tconsole.log('>>>', cmd.join(' '));\n\t\t\tchildProcess.spawn(cmd[0], cmd.slice(1), {\n\t\t\t\tcwd: mwpath,\n\t\t\t\tenv: process.env,\n\t\t\t\tstdio: 'inherit',\n\t\t\t}).on('close', function(code) {\n\t\t\t\tif (code === 0) {\n\t\t\t\t\tresolve(code);\n\t\t\t\t} else {\n\t\t\t\t\treject(code);\n\t\t\t\t}\n\t\t\t}).on('error', reject);\n\t\t});\n\t};\n\n\tvar pPARSERTESTS = path.join(__dirname, '..', 'tests', 'parser', targetName);\n\tvar mwPARSERTESTS = path.join(mwpath, file.path);\n\n\t// Fetch current Parsoid git hash.\n\tvar result = yield childProcess.execFile(\n\t\t'git', ['log', '--max-count=1', '--pretty=format:%H'], {\n\t\t\tcwd: __dirname,\n\t\t\tenv: process.env,\n\t\t}).promise;\n\tvar phash = strip(result.stdout);\n\n\t// A bit of user-friendly 
logging.\n\tconsole.log('Parsoid git HEAD is', phash);\n\tconsole.log('>>> cd', mwpath);\n\n\t// Create a new mediawiki/core branch, based on the previous sync point.\n\tyield mwexec('git fetch origin'.split(' '));\n\tyield mwexec(['git', 'checkout', '-b', branch, oldhash]);\n\n\t// Copy our locally-modified parser tests over to mediawiki/core.\n\t// cp __dirname/tests/parser/parserTests.txt $mwpath/tests/parser\n\ttry {\n\t\tvar data = yield fs.readFile(pPARSERTESTS);\n\t\tconsole.log('>>>', 'cp', pPARSERTESTS, mwPARSERTESTS);\n\t\tyield fs.writeFile(mwPARSERTESTS, data);\n\t} catch (e) {\n\t\t// cleanup\n\t\tyield mwexec('git checkout master'.split(' '));\n\t\tyield mwexec(['git', 'branch', '-d', branch]);\n\t\tthrow e;\n\t}\n\n\t// Make a new mediawiki/core commit with an appropriate message.\n\tvar commitmsg = 'Sync up with Parsoid ' + targetName;\n\tcommitmsg += '\\n\\nThis now aligns with Parsoid commit ' + phash;\n\t// Note the --allow-empty, because sometimes there are no parsoid-side\n\t// changes to merge. (We just need to get changes from upstream.)\n\tyield mwexec(['git', 'commit', '-m', commitmsg, '--allow-empty', mwPARSERTESTS]);\n\n\t// ok, we were successful at making the commit.  Give further instructions.\n\tconsole.log();\n\tconsole.log('Success!  Now:');\n\tconsole.log(' cd', mwpath);\n\tconsole.log(' git rebase --keep-empty origin/master');\n\tconsole.log(' .. fix any conflicts .. ');\n\tconsole.log(' php tests/parser/parserTests.php');\n\tconsole.log(' git review');\n\n\t// XXX to rebase semi-automatically, we might do something like:\n\t//  yield mwexec('git rebase origin/master'.split(' '));\n\t// XXX but it seems rather confusing to do it this way, since the\n\t// current working directory when we finish is still parsoid.\n\n\tprocess.exit(0);\n})().done();\n","usedDeprecatedRules":[{"ruleId":"no-buffer-constructor","replacedBy":[]},{"ruleId":"no-new-require","replacedBy":[]},{"ruleId":"no-process-exit","replacedBy":[]}]}]
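The block above is ESLint's `--format json` report captured inline in the log: a JSON array of per-file records, each with `filePath`, `messages`, `errorCount`, `warningCount`, and (for failing files) the full `source` text. A minimal sketch of how such a report could be summarized, assuming only the record shape visible above (the sample data and function name below are hypothetical, modeled on the log's structure):

```python
import json

def summarize_eslint_report(report_json):
    """Map each file with errors or warnings to the set of rule IDs it violated.

    Assumes ESLint --format json output: a list of records with
    'filePath', 'messages' (each with 'ruleId'), 'errorCount', 'warningCount'.
    """
    summary = {}
    for record in json.loads(report_json):
        if record["errorCount"] or record["warningCount"]:
            summary[record["filePath"]] = sorted(
                {m["ruleId"] for m in record["messages"]}
            )
    return summary

# Hypothetical two-file report mirroring the shape seen in the log above.
sample = json.dumps([
    {"filePath": "/src/repo/tools/fetch-wt.js",
     "messages": [{"ruleId": "node/no-missing-require", "severity": 2}],
     "errorCount": 1, "warningCount": 0},
    {"filePath": "/src/repo/tools/clean.js",
     "messages": [], "errorCount": 0, "warningCount": 0},
])
print(summarize_eslint_report(sample))
# → {'/src/repo/tools/fetch-wt.js': ['node/no-missing-require']}
```

In the real report above, the only failing files are the tools scripts whose `node/no-missing-require` errors stem from `require()` paths under `../lib/`, which no longer exist in the repository layout being linted.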

Traceback (most recent call last):
  File "/venv/lib/python3.7/site-packages/libup-0.0.1-py3.7.egg/libup/ng.py", line 1188, in main
    libup.run(args.repo, args.output, args.branch)
  File "/venv/lib/python3.7/site-packages/libup-0.0.1-py3.7.egg/libup/ng.py", line 1130, in run
    self.npm_upgrade(plan)
  File "/venv/lib/python3.7/site-packages/libup-0.0.1-py3.7.egg/libup/ng.py", line 854, in npm_upgrade
    hook(update)
  File "/venv/lib/python3.7/site-packages/libup-0.0.1-py3.7.egg/libup/ng.py", line 963, in _handle_eslint
    eslint_cfg = utils.load_ordered_json('.eslintrc.json')
  File "/venv/lib/python3.7/site-packages/libup-0.0.1-py3.7.egg/libup/utils.py", line 58, in load_ordered_json
    with open(fname) as f:
FileNotFoundError: [Errno 2] No such file or directory: '.eslintrc.json'
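The traceback shows the run aborting in libup's ESLint hook: `utils.load_ordered_json('.eslintrc.json')` calls `open()` on a file the repository no longer ships at its top level, and the resulting `FileNotFoundError` propagates uncaught. A hedged sketch of the pattern and a guarded variant; the `open()` call is taken from the traceback, while the ordered-hook detail and the `load_optional_config` helper are assumptions for illustration, not libup's actual code:

```python
import json
from collections import OrderedDict
from pathlib import Path

def load_ordered_json(fname):
    """Load a JSON file preserving key order.

    The open() call mirrors the traceback above; using
    object_pairs_hook=OrderedDict is an assumption about how
    ordering is preserved, not confirmed from libup's source.
    """
    with open(fname) as f:
        return json.load(f, object_pairs_hook=OrderedDict)

def load_optional_config(fname):
    """Hypothetical guarded variant: return None when the config is absent

    instead of letting FileNotFoundError abort the whole upgrade run.
    """
    if not Path(fname).exists():
        return None
    return load_ordered_json(fname)

print(load_optional_config(".eslintrc.does-not-exist.json"))
# → None
```

With a guard like this, a repository that has dropped `.eslintrc.json` would simply skip the ESLint-config handling step rather than failing the entire `npm_upgrade` plan.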
