Compare commits


58 Commits

Author SHA1 Message Date
Michael Telatynski
b7052a5c5f Merge branch 'develop' of github.com:vector-im/element-web into t3chguy/ci3
 Conflicts:
	.editorconfig
	.github/workflows/preview_changelog.yaml
	.github/workflows/static_analysis.yaml
	.github/workflows/triage-labelled.yml
	.github/workflows/triage-priority-bugs.yml
	.gitignore
	scripts/layered.sh
2022-06-06 11:47:18 +01:00
Michael Telatynski
3c170bbe96 Switch to composite actions for pr_details and sonarqube (#22409)
* Switch to composite actions for pr_details and sonarqube

* Bring back a reusable workflow for element-web stack sonarqube runs
2022-06-06 11:37:44 +01:00
Michael Telatynski
2a9587d4ff Clean up closed issues (duplicates and rageshakes) (#22451) 2022-06-06 10:26:29 +01:00
Michael Telatynski
69426387dc Put web app team issues in the board (#22452) 2022-06-06 10:03:24 +01:00
Germain
76c9535255 Remove Z-IA issues to delight board automation (#22414) 2022-06-01 09:22:43 +00:00
Johannes Krude
97fb7f0235 document custom home view (#21066)
Co-authored-by: Travis Ralston <travisr@matrix.org>
2022-05-31 21:57:18 +00:00
Travis Ralston
41f05541ed Disable no-non-null assertions lint rule (#22348)
This *allows* us to use `variable!.prop` to ensure `variable` is not null/undefined.
2022-05-30 08:43:56 -06:00
Michael Telatynski
4a91c172b2 Github Actions pull_request synchronize runs on PR open anyway (#22396)
* Github Actions pull_request synchronize runs on PR open anyway

* Update pull_request.yaml
2022-05-30 15:40:33 +01:00
Kerry
7c8ded1526 Add /coverage to .gitignore (#22397)
* add coverage to gitignore

Signed-off-by: Kerry Archibald <kerrya@element.io>

* ignore coverage
2022-05-30 15:27:26 +02:00
Kerry
e92d44eb56 matrix-mock-request to 2.0 (#22395) 2022-05-30 13:20:50 +01:00
Michael Telatynski
70a247446e Add logo to readme badge (#22374) 2022-05-27 12:10:16 -04:00
Kerry
5ade461ea5 unit test getVectorConfig (#22373)
* test getconfig

Signed-off-by: Kerry Archibald <kerrya@element.io>

* whitespace

Signed-off-by: Kerry Archibald <kerrya@element.io>
2022-05-27 13:09:27 +00:00
Kerry
9df5bf17f4 unit test WebPlatform (#22371)
* test most version stuff

Signed-off-by: Kerry Archibald <kerrya@element.io>

* tidy

Signed-off-by: Kerry Archibald <kerrya@element.io>

* eof

Signed-off-by: Kerry Archibald <kerrya@element.io>
2022-05-27 10:30:13 +00:00
Robin
51ed7784d5 Show a dialog when Jitsi encounters an error (#22352) 2022-05-26 10:03:55 -04:00
Robin
359e0e205f Make Lao translation available (#22358) 2022-05-26 08:49:58 -04:00
Michael Telatynski
89bffd132a Fix gha concurrency conditions (#22360) 2022-05-26 10:21:43 +01:00
Robin
9c92f55afd Work around a Jitsi log handling crash (#22353) 2022-05-25 21:50:27 +00:00
Michael Telatynski
745140e9e7 Remove stale release.sh parameter for no-jsdoc (#22255) 2022-05-25 20:55:53 +00:00
RiotRobot
863e5f6c78 Reset matrix-react-sdk back to develop branch 2022-05-24 15:14:34 +01:00
RiotRobot
4e6836d00e Reset matrix-js-sdk back to develop branch 2022-05-24 15:14:21 +01:00
RiotRobot
1cdbcf29ce Merge branch 'master' into develop
# Conflicts:
#	yarn.lock
2022-05-24 15:13:38 +01:00
Michael Telatynski
7c949f9f5a Fix wrongly using github.ref in workflow_run actions which always refer to develop (#22321) 2022-05-24 12:36:15 +01:00
James Salter
11a3011cbd Option to disable hardware acceleration on Element Desktop (#22295)
Override ElectronPlatform to support disableHardwareAcceleration
2022-05-23 11:50:10 +01:00
Robin
6c7f663983 Stop Jitsi if we time out while connecting to a video room (#22301) 2022-05-20 16:25:31 -04:00
Michael Telatynski
fab52795e3 Consolidate i18n check into a reusable workflow (#22248)
* Fix i18n check bypass for RiotTranslateBot

* Consolidate i18n check into a reusable workflow
2022-05-20 01:17:34 +01:00
Michael Telatynski
b2d057b7c3 Update triage-priority-bugs.yml (#22277) 2022-05-19 14:54:38 +01:00
Robin
d36dcd2766 Patch Jitsi logs into rageshakes (#22270)
* Patch Jitsi logs into rageshakes

* Remove unused import

* Fix types
2022-05-19 09:24:39 +01:00
Germain
ff7398b21f Remove spaces to delight board automation (#22260) 2022-05-18 11:59:32 +01:00
RiotRobot
f906cc3067 Merge pull request #22253 from vector-im/actions/upgrade-deps
Upgrade dependencies
2022-05-17 20:10:51 +01:00
t3chguy
06349e4a9c [create-pull-request] automated change 2022-05-17 19:03:51 +00:00
Michael Telatynski
20cc77401c Revert "Sonarcloud check out upstream develop not fork develop (#22239)" (#22249)
This reverts commit 56f3afc7f8.
2022-05-17 18:27:52 +01:00
Michael Telatynski
56f3afc7f8 Sonarcloud check out upstream develop not fork develop (#22239) 2022-05-17 18:09:32 +01:00
Michael Telatynski
b18c944b42 strings 2022-05-03 09:22:18 +01:00
Michael Telatynski
0bbad38929 Must it really be single quotes only? 2022-05-03 09:07:47 +01:00
Michael Telatynski
a431363768 Add CI to prevent i18n non-EN changes 2022-05-03 09:04:30 +01:00
Michael Telatynski
2d7bada5c4 Tweak 2022-04-27 23:09:35 +01:00
Michael Telatynski
0e22503860 Reusable workflows don't help us here 2022-04-27 23:07:06 +01:00
Michael Telatynski
a340358551 🤦 2022-04-27 23:01:00 +01:00
Michael Telatynski
e53f2695a4 Fix reusable workflow call 2022-04-27 22:56:46 +01:00
Michael Telatynski
0a239a30fa Update fetchdep to also take into account forks 2022-04-27 22:50:28 +01:00
Michael Telatynski
dab1488ffe Fix PR_NUMBER env variable 2022-04-27 22:43:51 +01:00
Michael Telatynski
61f8ec5e10 Test 2022-04-27 22:41:12 +01:00
Michael Telatynski
7a01c6c61b Update deploy_develop.yaml 2022-04-27 22:31:03 +01:00
Michael Telatynski
0ab7cd05c7 Consolidate the two secrets environments 2022-04-27 22:27:00 +01:00
Michael Telatynski
fa28d2400b Fix layered.sh 2022-04-27 22:11:23 +01:00
Michael Telatynski
7cc52e68d1 Consolidate workflows by means of reuse 2022-04-27 17:23:06 +01:00
Michael Telatynski
8bdd965122 Use fetchdep everywhere 2022-04-27 17:12:34 +01:00
Michael Telatynski
418de7998a Switch up develop redeploy script to work off Github Actions 2022-04-27 16:48:02 +01:00
Michael Telatynski
008889d2a8 Split back out into two workflows for artifact access 2022-04-27 13:29:14 +01:00
Michael Telatynski
b577d0f2f2 Fix usage of wrong action 2022-04-27 13:15:16 +01:00
Michael Telatynski
98733057a7 Attempt to use deployments more properly 2022-04-27 13:10:39 +01:00
Michael Telatynski
0cd6e02c99 Iterate 2022-04-27 12:41:46 +01:00
Michael Telatynski
449a0e64d9 Test 2022-04-27 12:38:09 +01:00
Michael Telatynski
89b3e4aaab Lets try this 2022-04-27 08:38:12 +01:00
Michael Telatynski
fe8c583e09 Attempt all of the CI 2022-04-27 07:53:30 +01:00
Michael Telatynski
5adf38c87f temporarily increase trigger 2022-04-26 18:19:19 +01:00
Michael Telatynski
59d7265e69 Test deployment hook 2022-04-26 18:17:47 +01:00
Michael Telatynski
446b510b82 Delint workflow scripts 2022-04-26 18:17:31 +01:00
28 changed files with 1511 additions and 962 deletions


@@ -30,6 +30,8 @@ module.exports = {
// We disable this while we're transitioning
"@typescript-eslint/no-explicit-any": "off",
// We're okay with assertion errors when we ask for them
"@typescript-eslint/no-non-null-assertion": "off",
// Ban matrix-js-sdk/src imports in favour of matrix-js-sdk/src/matrix imports to prevent unleashing hell.
"no-restricted-imports": ["error", {

.github/workflows/build.yaml (vendored, new file, 33 lines)

@@ -0,0 +1,33 @@
name: Build
on:
pull_request: { }
push:
branches: [ master ]
# develop pushes and repository_dispatch handled in build_develop.yaml
env:
# These must be set for fetchdep.sh to get the right branch
REPOSITORY: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
jobs:
build:
name: "Build"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v3
with:
cache: 'yarn'
- name: Install Dependencies
run: "./scripts/layered.sh"
- name: Build & Package
run: "./scripts/ci_package.sh"
- name: Upload Artifact
uses: actions/upload-artifact@v2
with:
name: previewbuild
path: dist/*.tar.gz
retention-days: 28

.github/workflows/build_develop.yaml (vendored, new file, 38 lines)

@@ -0,0 +1,38 @@
# Separate to the main build workflow for access to develop
# environment secrets, largely similar to build.yaml.
name: Build develop
on:
push:
branches: [ develop ]
repository_dispatch:
types: [ element-web-notify ]
jobs:
build:
name: "Build & Upload source maps to Sentry"
runs-on: ubuntu-latest
environment: develop
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v3
with:
cache: 'yarn'
- name: Install Dependencies
run: "./scripts/layered.sh"
- name: Build, Package & Upload sourcemaps
run: "./scripts/ci_package.sh"
env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_DSN: ${{ secrets.SENTRY_DSN }}
SENTRY_URL: ${{ secrets.SENTRY_URL }}
SENTRY_ORG: sentry
SENTRY_PROJECT: riot-web
- name: Upload Artifact
uses: actions/upload-artifact@v2
with:
name: previewbuild
path: dist/*.tar.gz
retention-days: 1

.github/workflows/deploy_develop.yaml (vendored, new file, 64 lines)

@@ -0,0 +1,64 @@
# Triggers after the Build has finished,
# because artifacts are not externally available
# until the end of their workflow.
name: Deploy develop.element.io
concurrency: deploy_develop
on:
workflow_run:
workflows: [ "Build develop" ]
types:
- completed
jobs:
deploy:
runs-on: ubuntu-latest
environment: develop
if: github.event.workflow_run.conclusion == 'success'
steps:
- name: Find Artifact ID
uses: actions/github-script@v3.1.0
id: find_artifact
with:
result-encoding: string
script: |
const artifacts = await github.actions.listWorkflowRunArtifacts({
owner: context.repo.owner,
repo: context.repo.repo,
run_id: ${{ github.event.workflow_run.id }},
});
const matchArtifact = artifacts.data.artifacts.filter((artifact) => {
return artifact.name == "previewbuild"
})[0];
const download = await github.actions.downloadArtifact({
owner: context.repo.owner,
repo: context.repo.repo,
artifact_id: matchArtifact.id,
archive_format: 'zip',
});
return download.url;
- name: Create Deployment
uses: bobheadxi/deployments@v1
id: deployment
with:
step: start
token: ${{ secrets.GITHUB_TOKEN }}
env: Develop
ref: ${{ github.head_ref }}
- name: Notify the redeploy script
uses: distributhor/workflow-webhook@v2
env:
webhook_url: ${{ secrets.DEVELOP_DEPLOY_WEBHOOK_URL }}
webhook_secret: ${{ secrets.DEVELOP_DEPLOY_WEBHOOK_SECRET }}
data: '{"url": "${{ steps.find_artifact.outputs.result }}"}'
- name: Update deployment status
uses: bobheadxi/deployments@v1
if: always()
with:
step: finish
token: ${{ secrets.GITHUB_TOKEN }}
status: ${{ job.status }}
env: ${{ steps.deployment.outputs.env }}
deployment_id: ${{ steps.deployment.outputs.deployment_id }}
env_url: https://develop.element.io

.github/workflows/issue_closed.yml (vendored, new file, 148 lines)

@@ -0,0 +1,148 @@
# For duplicate issues, ensure the close type is right (not planned), update it if not
# For all closed (completed) issues, cascade the closure onto any referenced rageshakes
# For all closed (not planned) issues, comment on rageshakes to move them into the canonical issue if one exists
on:
issues:
types: [ closed ]
jobs:
tidy:
name: Tidy closed issues
runs-on: ubuntu-latest
steps:
- uses: actions/github-script@v5
with:
# PAT needed as the GITHUB_TOKEN won't be able to see cross-references from other orgs (matrix-org)
github-token: ${{ secrets.ELEMENT_BOT_TOKEN }}
script: |
const variables = {
owner: context.repo.owner,
name: context.repo.repo,
number: context.issue.number,
};
const query = `query($owner:String!, $name:String!, $number:Int!) {
repository(owner: $owner, name: $name) {
issue(number: $number) {
stateReason,
timelineItems(first: 100, itemTypes: [MARKED_AS_DUPLICATE_EVENT, UNMARKED_AS_DUPLICATE_EVENT, CROSS_REFERENCED_EVENT]) {
edges {
node {
__typename
... on MarkedAsDuplicateEvent {
canonical {
... on Issue {
repository {
nameWithOwner
}
number
}
... on PullRequest {
repository {
nameWithOwner
}
number
}
}
}
... on UnmarkedAsDuplicateEvent {
canonical {
... on Issue {
repository {
nameWithOwner
}
number
}
... on PullRequest {
repository {
nameWithOwner
}
number
}
}
}
... on CrossReferencedEvent {
source {
... on Issue {
repository {
nameWithOwner
}
number
}
... on PullRequest {
repository {
nameWithOwner
}
number
}
}
}
}
}
}
}
}
}`;
const result = await github.graphql(query, variables);
const { stateReason, timelineItems: { edges } } = result.repository.issue;
const RAGESHAKE_OWNER = "matrix-org";
const RAGESHAKE_REPO = "element-web-rageshakes";
const rageshakes = new Set();
const duplicateOf = new Set();
console.log("Edges: ", JSON.stringify(edges));
for (const { node } of edges) {
switch(node.__typename) {
case "MarkedAsDuplicateEvent":
duplicateOf.add(node.canonical.repository.nameWithOwner + "#" + node.canonical.number);
break;
case "UnmarkedAsDuplicateEvent":
duplicateOf.remove(node.canonical.repository.nameWithOwner + "#" + node.canonical.number);
break;
case "CrossReferencedEvent":
if (node.source.repository.nameWithOwner === (RAGESHAKE_OWNER + "/" + RAGESHAKE_REPO)) {
rageshakes.add(node.source.number);
}
break;
}
}
console.log("Duplicate of: ", duplicateOf);
console.log("Found rageshakes: ", rageshakes);
if (duplicateOf.size) {
const body = Array.from(duplicateOf).join("\n");
// Comment on all rageshakes to create relationship to the issue this was closed as duplicate of
for (const rageshake of rageshakes) {
github.rest.issues.createComment({
owner: RAGESHAKE_OWNER,
repo: RAGESHAKE_REPO,
issue_number: rageshake,
body,
});
}
// Duplicate was closed with wrong reason, fix it
if (stateReason === "COMPLETED") {
await github.graphql(`mutation($id:ID!) {
closeIssue(input: { issueId:$id, stateReason:NOT_PLANNED }) {
clientMutationId
}
}`, {
id: context.payload.issue.node_id,
});
}
} else {
// This issue was closed, close all related rageshakes
for (const rageshake of rageshakes) {
github.rest.issues.update({
owner: RAGESHAKE_OWNER,
repo: RAGESHAKE_REPO,
issue_number: rageshake,
state: "closed",
});
}
}


@@ -2,6 +2,7 @@ name: Pull Request
on:
pull_request_target:
types: [ opened, edited, labeled, unlabeled, synchronize ]
concurrency: ${{ github.workflow }}-${{ github.event.pull_request.head.ref }}
jobs:
changelog:
name: Preview Changelog


@@ -5,33 +5,11 @@ on:
types:
- completed
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
group: ${{ github.workflow }}-${{ github.event.workflow_run.head_branch }}
cancel-in-progress: true
jobs:
prdetails:
name: PR Details
if: github.event.workflow_run.conclusion == 'success' && github.event.workflow_run.event == 'pull_request'
uses: matrix-org/matrix-js-sdk/.github/workflows/pr_details.yml@develop
with:
owner: ${{ github.event.workflow_run.head_repository.owner.login }}
branch: ${{ github.event.workflow_run.head_branch }}
sonarqube:
name: 🩻 SonarQube
needs: prdetails
# Only wait for prdetails if it isn't skipped
if: |
always() &&
(needs.prdetails.result == 'success' || needs.prdetails.result == 'skipped') &&
github.event.workflow_run.conclusion == 'success'
uses: matrix-org/matrix-js-sdk/.github/workflows/sonarcloud.yml@develop
with:
repo: ${{ github.event.workflow_run.head_repository.full_name }}
pr_id: ${{ needs.prdetails.outputs.pr_id }}
head_branch: ${{ needs.prdetails.outputs.head_branch || github.event.workflow_run.head_branch }}
base_branch: ${{ needs.prdetails.outputs.base_branch }}
revision: ${{ github.event.workflow_run.head_sha }}
coverage_workflow_name: tests.yml
coverage_run_id: ${{ github.event.workflow_run.id }}
secrets:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}


@@ -28,39 +28,7 @@ jobs:
i18n_lint:
name: "i18n Check"
runs-on: ubuntu-latest
permissions:
pull-requests: read
steps:
- uses: actions/checkout@v2
- name: "Get modified files"
id: changed_files
if: github.event_name == 'pull_request'
uses: tj-actions/changed-files@v19
with:
files: |
src/i18n/strings/*
files_ignore: |
src/i18n/strings/en_EN.json
- name: "Assert only en_EN was modified"
if: |
github.event_name == 'pull_request' &&
github.actor != 'RiotTranslateBot' &&
steps.changed_files.outputs.any_modified == 'true'
run: |
echo "You can only modify en_EN.json, do not touch any of the other i18n files as Weblate will be confused"
exit 1
- uses: actions/setup-node@v3
with:
cache: 'yarn'
# Does not need branch matching as only analyses this layer
- name: Install Deps
run: "yarn install --pure-lockfile"
- name: i18n Check
run: "yarn run diff-i18n"
uses: matrix-org/matrix-react-sdk/.github/workflows/i18n_check.yml@develop
js_lint:
name: "ESLint"

.github/workflows/test.yaml (vendored, new file, 27 lines)

@@ -0,0 +1,27 @@
name: Test
on:
pull_request: { }
push:
branches: [ master, develop ]
repository_dispatch:
types: [ element-web-notify ]
env:
# These must be set for fetchdep.sh to get the right branch
REPOSITORY: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
jobs:
test:
name: "Test"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v3
with:
cache: 'yarn'
- name: Install Dependencies
run: "./scripts/layered.sh"
- name: Run Tests
run: "yarn test"

.github/workflows/triage-assigned.yml (vendored, new file, 18 lines)

@@ -0,0 +1,18 @@
name: Move issued assigned to specific team members to their boards
on:
issues:
types: [ assigned ]
jobs:
web-app-team:
runs-on: ubuntu-latest
if: |
contains(github.event.issue.assignees.*.login, 't3chguy') ||
contains(github.event.issue.assignees.*.login, 'turt2live')
steps:
- uses: alex-page/github-project-automation-plus@bb266ff4dde9242060e2d5418e120a133586d488
with:
project: Web App Team
column: Ready
repo-token: ${{ secrets.ELEMENT_BOT_TOKEN }}


@@ -2,7 +2,7 @@ name: Move labelled issues to correct projects
on:
issues:
types: [ labeled ]
types: [labeled]
jobs:
apply_Z-Labs_label:
@@ -100,11 +100,7 @@ jobs:
runs-on: ubuntu-latest
if: >
contains(github.event.issue.labels.*.name, 'A-New-Search-Experience') ||
contains(github.event.issue.labels.*.name, 'A-Spaces') ||
contains(github.event.issue.labels.*.name, 'A-Space-Settings') ||
contains(github.event.issue.labels.*.name, 'A-Subspaces') ||
contains(github.event.issue.labels.*.name, 'Team: Delight') ||
contains(github.event.issue.labels.*.name, 'Z-IA') ||
contains(github.event.issue.labels.*.name, 'Z-NewUserJourney')
steps:
- uses: octokit/graphql-action@v2.x


@@ -2,35 +2,9 @@ name: Move P1 bugs to boards
on:
issues:
types: [ labeled, unlabeled ]
types: [labeled, unlabeled]
jobs:
p1_issues_to_team_workboard:
runs-on: ubuntu-latest
if: >
(!contains(github.event.issue.labels.*.name, 'A-E2EE') &&
!contains(github.event.issue.labels.*.name, 'A-E2EE-Cross-Signing') &&
!contains(github.event.issue.labels.*.name, 'A-E2EE-Dehydration') &&
!contains(github.event.issue.labels.*.name, 'A-E2EE-Key-Backup') &&
!contains(github.event.issue.labels.*.name, 'A-E2EE-SAS-Verification') &&
!contains(github.event.issue.labels.*.name, 'A-Spaces') &&
!contains(github.event.issue.labels.*.name, 'A-Spaces-Settings') &&
!contains(github.event.issue.labels.*.name, 'A-Subspaces')) &&
(contains(github.event.issue.labels.*.name, 'T-Defect') &&
contains(github.event.issue.labels.*.name, 'S-Critical') &&
(contains(github.event.issue.labels.*.name, 'O-Frequent') ||
contains(github.event.issue.labels.*.name, 'O-Occasional')) ||
contains(github.event.issue.labels.*.name, 'S-Major') &&
contains(github.event.issue.labels.*.name, 'O-Frequent') ||
contains(github.event.issue.labels.*.name, 'A11y') &&
contains(github.event.issue.labels.*.name, 'O-Frequent'))
steps:
- uses: alex-page/github-project-automation-plus@bb266ff4dde9242060e2d5418e120a133586d488
with:
project: Web App Team
column: P1
repo-token: ${{ secrets.ELEMENT_BOT_TOKEN }}
P1_issues_to_crypto_team_workboard:
runs-on: ubuntu-latest
if: >

.gitignore (vendored, 3 lines added)

@@ -23,3 +23,6 @@ electron/pub
.vscode
.vscode/
.env
/coverage
/scripts/extracted/
/scripts/latest


@@ -1,4 +1,4 @@
[![Chat](https://img.shields.io/matrix/element-web:matrix.org)](https://matrix.to/#/#element-web:matrix.org)
[![Chat](https://img.shields.io/matrix/element-web:matrix.org?logo=matrix)](https://matrix.to/#/#element-web:matrix.org)
![Tests](https://github.com/vector-im/element-web/actions/workflows/tests.yaml/badge.svg)
![Static Analysis](https://github.com/vector-im/element-web/actions/workflows/static_analysis.yaml/badge.svg)
[![Weblate](https://translate.element.io/widgets/element-web/-/element-web/svg-badge.svg)](https://translate.element.io/engage/element-web/)


@@ -193,7 +193,7 @@ Starting with `branding`, the following subproperties are available:
`welcome.html` that ships with Element will be used instead.
2. `home_url`: A URL to an HTML page to show within the app as the "home" page. When the app doesn't have a room/screen to
show the user, it will use the home page instead. The home page is additionally accessible from the user menu. By default,
no home page is set and therefore a hardcoded landing screen is used.
no home page is set and therefore a hardcoded landing screen is used. More documentation and examples are [here](./custom-home.md).
3. `login_for_welcome`: When `true` (default `false`), the app will use the login form as a welcome page instead of the welcome
page itself. This disables use of `welcome_url` and all welcome page functionality.

docs/custom-home.md (new file, 65 lines)

@@ -0,0 +1,65 @@
# Custom Home Page
The home page is shown whenever the user is logged in, but no room is selected.
A custom `home.html` replacing the default home page can be configured either in `.well-known/matrix/client` or `config.json`.
Such a custom home page can be used to communicate helpful information and important rules to the users.
## Configuration
To provide a custom home page for all element-web/desktop users of a homeserver, include the following in `.well-known/matrix/client`:
```
{
"io.element.embedded_pages": {
"home_url": "https://example.org/home.html"
}
}
```
The home page can be overridden in `config.json` to provide all users of an element-web installation with the same experience:
```
{
"embeddedPages": {
"homeUrl": "https://example.org/home.html"
}
}
```
## `home.html` Example
The following is a simple example for a custom `home.html`:
```
<style type="text/css">
.tos {
width: auto;
color: black;
background : #ffcccb;
font-weight: bold;
}
</style>
<h1>The example.org Matrix Server</h1>
<div class="tos">
<p>Behave appropriately.</p>
</div>
<h2>Start Chatting</h2>
<ul>
<li><a href="#/dm">Send a Direct Message</a></li>
<li><a href="#/directory">Explore Public Rooms</a></li>
<li><a href="#/new">Create a Group Chat</a></li>
</ul>
```
When choosing colors, be aware that the home page may be displayed in either light or dark mode.
It may be needed to set CORS headers for the `home.html` to enable element-desktop to fetch it, with e.g., the following nginx config:
```
add_header Access-Control-Allow-Origin *;
```


@@ -58,8 +58,8 @@
"gfm.css": "^1.1.2",
"jsrsasign": "^10.2.0",
"katex": "^0.12.0",
"matrix-js-sdk": "18.0.0",
"matrix-react-sdk": "3.45.0",
"matrix-js-sdk": "github:matrix-org/matrix-js-sdk#develop",
"matrix-react-sdk": "github:matrix-org/matrix-react-sdk#develop",
"matrix-widget-api": "^0.1.0-beta.18",
"prop-types": "^15.7.2",
"react": "17.0.2",
@@ -123,7 +123,7 @@
"jest-sonar-reporter": "^2.0.0",
"json-loader": "^0.5.7",
"loader-utils": "^1.4.0",
"matrix-mock-request": "^1.2.3",
"matrix-mock-request": "^2.0.0",
"matrix-react-test-utils": "^0.2.3",
"matrix-web-i18n": "^1.2.0",
"mini-css-extract-plugin": "^0.12.0",


@@ -43,7 +43,7 @@ do
fi
done
./node_modules/matrix-js-sdk/release.sh -n -z "$orig_args"
./node_modules/matrix-js-sdk/release.sh -n "$orig_args"
release="${1#v}"
tag="v${release}"


@@ -35,6 +35,7 @@ const INCLUDE_LANGS = [
{'value': 'ja', 'label': '日本語'},
{'value': 'kab', 'label': 'Taqbaylit'},
{'value': 'ko', 'label': '한국어'},
{'value': 'lo', 'label': 'ລາວ'},
{'value': 'lt', 'label': 'Lietuvių'},
{'value': 'lv', 'label': 'Latviešu'},
{'value': 'nb_NO', 'label': 'Norwegian Bokmål'},


@@ -1,7 +1,7 @@
#!/usr/bin/env python
#
# download and unpack a element-web tarball.
#
# Allows `bundles` to be extracted to a common directory, and a link to
# config.json to be added.
@@ -23,9 +23,11 @@ except ImportError:
# python2
from urllib import urlretrieve
class DeployException(Exception):
pass
def create_relative_symlink(linkname, target):
relpath = os.path.relpath(target, os.path.dirname(linkname))
print ("Symlink %s -> %s" % (linkname, relpath))
@@ -57,10 +59,11 @@ def move_bundles(source, dest):
else:
renames[os.path.join(source, f)] = dst
for (src, dst) in renames.iteritems():
for (src, dst) in renames.items():
print ("Move %s -> %s" % (src, dst))
os.rename(src, dst)
class Deployer:
def __init__(self):
self.packages_path = "."
@@ -100,7 +103,7 @@ class Deployer:
print ("Extracted into: %s" % extracted_dir)
if self.symlink_paths:
for link_path, file_path in self.symlink_paths.iteritems():
for link_path, file_path in self.symlink_paths.items():
create_relative_symlink(
target=file_path,
linkname=os.path.join(extracted_dir, link_path)
@@ -139,6 +142,7 @@ class Deployer:
print ("Done")
return local_filename
if __name__ == "__main__":
parser = argparse.ArgumentParser("Deploy a Riot build on a web server.")
parser.add_argument(


@@ -1,99 +0,0 @@
#!/bin/bash
# Fetches the js-sdk and matrix-react-sdk dependencies for development
# or testing purposes
# If there exists a branch of that dependency with the same name as
# the branch the current checkout is on, use that branch. Otherwise,
# use develop.
set -ex
GIT_CLONE_ARGS=("$@")
[ -z "$defbranch" ] && defbranch="develop"
# clone a specific branch of a github repo
function clone() {
org=$1
repo=$2
branch=$3
# Chop 'origin' off the start as jenkins ends up using
# branches on the origin, but this doesn't work if we
# specify the branch when cloning.
branch=${branch#origin/}
if [ -n "$branch" ]
then
echo "Trying to use $org/$repo#$branch"
# Disable auth prompts: https://serverfault.com/a/665959
GIT_TERMINAL_PROMPT=0 git clone https://github.com/$org/$repo.git $repo --branch $branch \
"${GIT_CLONE_ARGS[@]}"
return $?
fi
return 1
}
function dodep() {
deforg=$1
defrepo=$2
rm -rf $defrepo
# Try the PR author's branch in case it exists on the deps as well.
# Try the target branch of the push or PR.
# Use the default branch as the last resort.
if [[ "$BUILDKITE" == true ]]; then
# If BUILDKITE_BRANCH is set, it will contain either:
# * "branch" when the author's branch and target branch are in the same repo
# * "author:branch" when the author's branch is in their fork
# We can split on `:` into an array to check.
BUILDKITE_BRANCH_ARRAY=(${BUILDKITE_BRANCH//:/ })
if [[ "${#BUILDKITE_BRANCH_ARRAY[@]}" == "2" ]]; then
prAuthor=${BUILDKITE_BRANCH_ARRAY[0]}
prBranch=${BUILDKITE_BRANCH_ARRAY[1]}
else
prAuthor=$deforg
prBranch=$BUILDKITE_BRANCH
fi
clone $prAuthor $defrepo $prBranch ||
clone $deforg $defrepo $BUILDKITE_PULL_REQUEST_BASE_BRANCH ||
clone $deforg $defrepo $defbranch ||
return $?
else
clone $deforg $defrepo $ghprbSourceBranch ||
clone $deforg $defrepo $GIT_BRANCH ||
clone $deforg $defrepo `git rev-parse --abbrev-ref HEAD` ||
clone $deforg $defrepo $defbranch ||
return $?
fi
echo "$defrepo set to branch "`git -C "$defrepo" rev-parse --abbrev-ref HEAD`
}
##############################
echo 'Setting up matrix-js-sdk'
dodep matrix-org matrix-js-sdk
pushd matrix-js-sdk
yarn link
yarn install --pure-lockfile
popd
yarn link matrix-js-sdk
##############################
echo 'Setting up matrix-react-sdk'
dodep matrix-org matrix-react-sdk
pushd matrix-react-sdk
yarn link
yarn link matrix-js-sdk
yarn install --pure-lockfile
popd
yarn link matrix-react-sdk
##############################

scripts/fetchdep.sh (new executable file, 78 lines)

@@ -0,0 +1,78 @@
#!/bin/bash
set -x
deforg="$1"
defrepo="$2"
defbranch="$3"
[ -z "$defbranch" ] && defbranch="develop"
rm -r "$defrepo" || true
# A function that clones a branch of a repo based on the org, repo and branch
clone() {
org=$1
repo=$2
branch=$3
if [ -n "$branch" ]
then
echo "Trying to use $org/$repo#$branch"
# Disable auth prompts: https://serverfault.com/a/665959
GIT_TERMINAL_PROMPT=0 git clone https://github.com/$org/$repo.git $repo --branch "$branch" --depth 1 && exit 0
fi
}
# A function that gets info about a PR from the GitHub API based on its number
getPRInfo() {
number=$1
if [ -n "$number" ]; then
echo "Getting info about a PR with number $number"
apiEndpoint="https://api.github.com/repos/${REPOSITORY:-"vector-im/element-web"}/pulls/"
apiEndpoint+=$number
head=$(curl $apiEndpoint | jq -r '.head.label')
fi
}
# Some CIs don't give us enough info, so we just get the PR number and ask the
# GH API for more info - "fork:branch". Some give us this directly.
if [ -n "$BUILDKITE_BRANCH" ]; then
# BuildKite
head=$BUILDKITE_BRANCH
elif [ -n "$PR_NUMBER" ]; then
# GitHub
getPRInfo $PR_NUMBER
elif [ -n "$REVIEW_ID" ]; then
# Netlify
getPRInfo $REVIEW_ID
fi
# for forks, $head will be in the format "fork:branch", so we split it by ":"
# into an array. On non-forks, this has the effect of splitting into a single
# element array given ":" shouldn't appear in the head - it'll just be the
# branch name. Based on the results, we clone.
BRANCH_ARRAY=(${head//:/ })
TRY_ORG=$deforg
TRY_BRANCH=${BRANCH_ARRAY[0]}
if [[ "$head" == *":"* ]]; then
# ... but only match that fork if it's a real fork
if [ "${BRANCH_ARRAY[0]}" != "matrix-org" ]; then
TRY_ORG=${BRANCH_ARRAY[0]}
fi
TRY_BRANCH=${BRANCH_ARRAY[1]}
fi
clone ${TRY_ORG} $defrepo ${TRY_BRANCH}
# Try the target branch of the push or PR.
if [ -n "$GITHUB_BASE_REF" ]; then
clone $deforg $defrepo $GITHUB_BASE_REF
elif [ -n "$BUILDKITE_PULL_REQUEST_BASE_BRANCH" ]; then
clone $deforg $defrepo $BUILDKITE_PULL_REQUEST_BASE_BRANCH
fi
# Try HEAD which is the branch name in Netlify (not BRANCH which is pull/xxxx/head for PR builds)
clone $deforg $defrepo $HEAD
# Use the default branch as the last resort.
clone $deforg $defrepo $defbranch


@@ -1,45 +1,40 @@
#!/usr/bin/env python
#
# auto-deploy script for https://develop.element.io
#
# Listens for buildkite webhook pokes (https://buildkite.com/docs/apis/webhooks)
# When it gets one, downloads the artifact from buildkite
# and deploys it as the new version.
#
# Listens for Github Action webhook pokes (https://github.com/marketplace/actions/workflow-webhook-action)
# When it gets one: downloads the artifact from github actions and deploys it as the new version.
# Requires the following python packages:
#
# - requests
# - flask
#
# - python-github-webhook
from __future__ import print_function
import json, requests, tarfile, argparse, os, errno
import argparse
import os
import errno
import time
import traceback
from urlparse import urljoin
import glob
import re
import shutil
import threading
from Queue import Queue
from flask import Flask, jsonify, request, abort
import glob
from io import BytesIO
from urllib.request import urlopen
from zipfile import ZipFile
from github_webhook import Webhook
from flask import Flask, abort
from deploy import Deployer, DeployException
app = Flask(__name__)
webhook = Webhook(app, endpoint="/")
deployer = None
arg_extract_path = None
arg_symlink = None
arg_webhook_token = None
arg_api_token = None
workQueue = Queue()
def create_symlink(source, linkname):
def create_symlink(source: str, linkname: str):
try:
os.symlink(source, linkname)
except OSError, e:
except OSError as e:
if e.errno == errno.EEXIST:
# atomic modification
os.symlink(source, linkname + ".tmp")
@@ -47,118 +42,43 @@ def create_symlink(source, linkname):
else:
raise e
def req_headers():
return {
"Authorization": "Bearer %s" % (arg_api_token,),
}
# Buildkite considers a poke to have failed if it has to wait more than 10s for
# data (any data, not just the initial response) and it normally takes longer than
# that to download an artifact from buildkite. Apparently there is no way in flask
# to finish the response and then keep doing stuff, so instead this has to involve
# threading. Sigh.
def worker_thread():
while True:
toDeploy = workQueue.get()
deploy_buildkite_artifact(*toDeploy)
@app.route("/", methods=["POST"])
def on_receive_buildkite_poke():
got_webhook_token = request.headers.get('X-Buildkite-Token')
if got_webhook_token != arg_webbook_token:
print("Denying request with incorrect webhook token: %s" % (got_webhook_token,))
abort(400, "Incorrect webhook token")
@webhook.hook(event_type="workflow_run")
def on_deployment(payload: dict):
repository = payload.get("repository")
if repository is None:
abort(400, "No 'repository' specified")
return
required_api_prefix = None
if arg_buildkite_org is not None:
required_api_prefix = 'https://api.buildkite.com/v2/organizations/%s' % (arg_buildkite_org,)
incoming_json = request.get_json()
if not incoming_json:
abort(400, "No JSON provided!")
return
print("Incoming JSON: %s" % (incoming_json,))
event = incoming_json.get("event")
if event is None:
abort(400, "No 'event' specified")
workflow = payload.get("workflow")
if repository is None:
abort(400, "No 'workflow' specified")
return
if event == 'ping':
print("Got ping request - responding")
return jsonify({'response': 'pong!'})
if event != 'build.finished':
print("Rejecting '%s' event")
abort(400, "Unrecognised event")
request_id = payload.get("requestID")
if request_id is None:
abort(400, "No 'request_id' specified")
return
build_obj = incoming_json.get("build")
if build_obj is None:
abort(400, "No 'build' object")
if arg_github_org is not None and not repository.startswith(arg_github_org):
print("Denying poke for repository with incorrect prefix: %s" % (repository,))
abort(400, "Invalid repository")
return
build_url = build_obj.get('url')
if build_url is None:
abort(400, "build has no url")
if arg_github_workflow is not None and workflow != arg_github_workflow:
print("Denying poke for incorrect workflow: %s" % (workflow,))
abort(400, "Incorrect workflow")
return
if required_api_prefix is not None and not build_url.startswith(required_api_prefix):
print("Denying poke for build url with incorrect prefix: %s" % (build_url,))
abort(400, "Invalid build url")
artifact_url = payload.get("data", {}).get("url")
if artifact_url is None:
abort(400, "No 'data.url' specified")
return
build_num = build_obj.get('number')
if build_num is None:
abort(400, "build has no number")
return
deploy_artifact(artifact_url, request_id)
pipeline_obj = incoming_json.get("pipeline")
if pipeline_obj is None:
abort(400, "No 'pipeline' object")
return
pipeline_name = pipeline_obj.get('name')
if pipeline_name is None:
abort(400, "pipeline has no name")
return
artifacts_url = build_url + "/artifacts"
artifacts_resp = requests.get(artifacts_url, headers=req_headers())
artifacts_resp.raise_for_status()
artifacts_array = artifacts_resp.json()
artifact_to_deploy = None
for artifact in artifacts_array:
if re.match(r"dist/.*.tar.gz", artifact['path']):
artifact_to_deploy = artifact
if artifact_to_deploy is None:
print("No suitable artifacts found")
return jsonify({})
# double paranoia check: make sure the artifact is on the right org too
if required_api_prefix is not None and not artifact_to_deploy['url'].startswith(required_api_prefix):
print("Denying poke for build url with incorrect prefix: %s" % (artifact_to_deploy['url'],))
abort(400, "Refusing to deploy artifact from URL %s", artifact_to_deploy['url'])
return
# there's no point building up a queue of things to deploy, so if there are any pending jobs,
# remove them
while not workQueue.empty():
try:
workQueue.get(False)
except:
pass
workQueue.put([artifact_to_deploy, pipeline_name, build_num])
return jsonify({})
def deploy_buildkite_artifact(artifact, pipeline_name, build_num):
artifact_response = requests.get(artifact['url'], headers=req_headers())
artifact_response.raise_for_status()
artifact_obj = artifact_response.json()
def deploy_artifact(artifact_url: str, request_id: str):
# we extract into a directory based on the build number. This avoids the
# problem of multiple builds building the same git version and thus having
# the same tarball name. That would lead to two potential problems:
@@ -166,58 +86,42 @@ def deploy_buildkite_artifact(artifact, pipeline_name, build_num):
# a good deploy with a bad one
# (b) we'll be overwriting the live deployment, which means people might
# see half-written files.
build_dir = os.path.join(arg_extract_path, "%s-#%s" % (pipeline_name, build_num))
try:
extracted_dir = deploy_tarball(artifact_obj, build_dir)
except DeployException as e:
traceback.print_exc()
abort(400, e.message)
build_dir = os.path.join(arg_extract_path, "gha-%s" % (request_id,))
create_symlink(source=extracted_dir, linkname=arg_symlink)
def deploy_tarball(artifact, build_dir):
"""Download a tarball from jenkins and unpack it
Returns:
(str) the path to the unpacked deployment
"""
if os.path.exists(build_dir):
raise DeployException(
"Not deploying. We have previously deployed this build."
)
# We have already deployed this, nop
return
os.mkdir(build_dir)
print("Fetching artifact %s -> %s..." % (artifact['download_url'], artifact['filename']))
# Download the tarball here as buildkite needs auth to do this
# we don't pgp-sign buildkite artifacts, relying on HTTPS and buildkite
# not being evil. If that's not good enough for you, don't use develop.element.io.
resp = requests.get(artifact['download_url'], stream=True, headers=req_headers())
resp.raise_for_status()
with open(artifact['filename'], 'wb') as ofp:
shutil.copyfileobj(resp.raw, ofp)
print("...download complete. Deploying...")
# we rely on the fact that flask only serves one request at a time to
# ensure that we do not overwrite a tarball from a concurrent request.
return deployer.deploy(artifact['filename'], build_dir)
try:
with urlopen(artifact_url) as f:
with ZipFile(BytesIO(f.read()), "r") as z:
name = next((x for x in z.namelist() if x.endswith(".tar.gz")))
z.extract(name, build_dir)
extracted_dir = deployer.deploy(os.path.join(build_dir, name), build_dir)
create_symlink(source=extracted_dir, linkname=arg_symlink)
except DeployException as e:
traceback.print_exc()
abort(400, str(e))
finally:
if deployer.should_clean:
os.remove(os.path.join(build_dir, name))
if __name__ == "__main__":
parser = argparse.ArgumentParser("Runs a Vector redeployment server.")
parser = argparse.ArgumentParser("Runs an Element redeployment server.")
parser.add_argument(
"-p", "--port", dest="port", default=4000, type=int, help=(
"The port to listen on for requests from Jenkins."
"The port to listen on for redeployment requests."
)
)
parser.add_argument(
"-e", "--extract", dest="extract", default="./extracted", help=(
"-e", "--extract", dest="extract", default="./extracted", type=str, help=(
"The location to extract .tar.gz files to."
)
)
parser.add_argument(
"-b", "--bundles-dir", dest="bundles_dir", help=(
"-b", "--bundles-dir", dest="bundles_dir", type=str, help=(
"A directory to move the contents of the 'bundles' directory to. A \
symlink to the bundles directory will also be written inside the \
extracted tarball. Example: './bundles'."
@@ -229,7 +133,7 @@ if __name__ == "__main__":
)
)
parser.add_argument(
"-s", "--symlink", dest="symlink", default="./latest", help=(
"-s", "--symlink", dest="symlink", default="./latest", type=str, help=(
"Write a symlink to this location pointing to the extracted tarball. \
New builds will keep overwriting this symlink. The symlink will point \
to the /vector directory INSIDE the tarball."
@@ -238,71 +142,65 @@ if __name__ == "__main__":
# --include ../../config.json ./localhost.json homepages/*
parser.add_argument(
"--include", nargs='*', default='./config*.json', help=(
"--include", nargs='*', default='./config*.json', type=str, help=(
"Symlink these files into the root of the deployed tarball. \
Useful for config files and home pages. Supports glob syntax. \
(Default: '%(default)s')"
)
)
parser.add_argument(
"--test", dest="tarball_uri", help=(
"Don't start an HTTP listener. Instead download a build from Jenkins \
immediately."
"--test", dest="tarball_uri", type=str, help=(
"Don't start an HTTP listener. Instead download a build from this URL immediately."
),
)
parser.add_argument(
"--webhook-token", dest="webhook_token", help=(
"Only accept pokes with this buildkite token."
), required=True,
)
parser.add_argument(
"--api-token", dest="api_token", help=(
"API access token for buildkite. Require read_artifacts scope."
"--webhook-token", dest="webhook_token", type=str, help=(
"Only accept pokes signed with this github token."
), required=True,
)
# We require a matching webhook token, but because we take everything else
# about what to deploy from the poke body, we can be a little more paranoid
# and only accept builds / artifacts from a specific buildkite org
# and only accept builds / artifacts from a specific github org
parser.add_argument(
"--org", dest="buildkite_org", help=(
"Lock down to this buildkite org"
"--org", dest="github_org", type=str, help=(
"Lock down to this github org"
)
)
# Optional matching workflow name
parser.add_argument(
"--workflow", dest="github_workflow", type=str, help=(
"Lock down to this github workflow"
)
)
args = parser.parse_args()
arg_extract_path = args.extract
arg_symlink = args.symlink
arg_webbook_token = args.webhook_token
arg_api_token = args.api_token
arg_buildkite_org = args.buildkite_org
arg_github_org = args.github_org
arg_github_workflow = args.github_workflow
if not os.path.isdir(arg_extract_path):
os.mkdir(arg_extract_path)
webhook.secret = args.webhook_token
deployer = Deployer()
deployer.bundles_path = args.bundles_dir
deployer.should_clean = args.clean
for include in args.include:
for include in args.include.split(" "):
deployer.symlink_paths.update({ os.path.basename(pth): pth for pth in glob.iglob(include) })
if args.tarball_uri is not None:
build_dir = os.path.join(arg_extract_path, "test-%i" % (time.time()))
deploy_tarball(args.tarball_uri, build_dir)
else:
print(
"Listening on port %s. Extracting to %s%s. Symlinking to %s. Include files: %s" %
(args.port,
arg_extract_path,
" (clean after)" if deployer.should_clean else "",
arg_symlink,
deployer.symlink_paths,
)
print(
"Listening on port %s. Extracting to %s%s. Symlinking to %s. Include files: %s" %
(args.port,
arg_extract_path,
" (clean after)" if deployer.should_clean else "",
arg_symlink,
deployer.symlink_paths,
)
fred = threading.Thread(target=worker_thread)
fred.daemon = True
fred.start()
app.run(port=args.port, debug=False)
)
app.run(port=args.port, debug=False)


@@ -156,6 +156,15 @@ const ack = (ev: CustomEvent<IWidgetApiRequest>) => widgetApi.transport.reply(ev
ack(ev);
},
);
widgetApi.on(`action:${ElementWidgetActions.ForceHangupCall}`,
(ev: CustomEvent<IWidgetApiRequest>) => {
meetApi?.dispose();
notifyHangup();
meetApi = null;
closeConference();
ack(ev);
},
);
widgetApi.on(`action:${ElementWidgetActions.MuteAudio}`,
async (ev: CustomEvent<IWidgetApiRequest>) => {
ack(ev);
@@ -291,12 +300,12 @@ function createJWTToken() {
);
}
async function notifyHangup() {
async function notifyHangup(errorMessage?: string) {
if (widgetApi) {
// We send the hangup event before setAlwaysOnScreen, because the latter
// can cause the receiving side to instantly stop listening.
try {
await widgetApi.transport.send(ElementWidgetActions.HangupCall, {});
await widgetApi.transport.send(ElementWidgetActions.HangupCall, { errorMessage });
} finally {
await widgetApi.setAlwaysOnScreen(false);
}
@@ -357,6 +366,12 @@ function joinConference(audioDevice?: string, videoDevice?: string) {
startAudioOnly,
startWithAudioMuted: audioDevice == null,
startWithVideoMuted: videoDevice == null,
// Request some log levels for inclusion in rageshakes
// Ideally we would capture all possible log levels, but this can
// cause Jitsi Meet to try to post various circular data structures
// back over the iframe API, and therefore end up crashing
// https://github.com/jitsi/jitsi-meet/issues/11585
apiLogLevels: ["warn", "error"],
} as any,
jwt: jwt,
};
@@ -403,7 +418,7 @@ function joinConference(audioDevice?: string, videoDevice?: string) {
if (error.isFatal) {
// We got disconnected. Since Jitsi Meet might send us back to the
// prejoin screen, we're forced to act as if we hung up entirely.
notifyHangup();
notifyHangup(error.message);
meetApi = null;
closeConference();
}
@@ -411,7 +426,7 @@ function joinConference(audioDevice?: string, videoDevice?: string) {
meetApi.on("audioMuteStatusChanged", ({ muted }) => {
const action = muted ? ElementWidgetActions.MuteAudio : ElementWidgetActions.UnmuteAudio;
widgetApi.transport.send(action, {});
widgetApi?.transport.send(action, {});
});
meetApi.on("videoMuteStatusChanged", ({ muted }) => {
@@ -421,10 +436,10 @@ function joinConference(audioDevice?: string, videoDevice?: string) {
// otherwise the React SDK will mistakenly think the user turned off
// their video by hand
setTimeout(() => {
if (meetApi) widgetApi.transport.send(ElementWidgetActions.MuteVideo, {});
if (meetApi) widgetApi?.transport.send(ElementWidgetActions.MuteVideo, {});
}, 200);
} else {
widgetApi.transport.send(ElementWidgetActions.UnmuteVideo, {});
widgetApi?.transport.send(ElementWidgetActions.UnmuteVideo, {});
}
});
@@ -435,4 +450,9 @@ function joinConference(audioDevice?: string, videoDevice?: string) {
});
});
});
// Patch logs into rageshakes
meetApi.on("log", ({ logLevel, args }) =>
(parent as unknown as typeof global).mx_rage_logger?.log(logLevel, ...args),
);
}


@@ -414,6 +414,18 @@ export default class ElectronPlatform extends VectorBasePlatform {
return this.ipcCall('setMinimizeToTrayEnabled', enabled);
}
public supportsTogglingHardwareAcceleration(): boolean {
return true;
}
public async getHardwareAccelerationEnabled(): Promise<boolean> {
return this.ipcCall('getHardwareAccelerationEnabled');
}
public async setHardwareAccelerationEnabled(enabled: boolean): Promise<void> {
return this.ipcCall('setHardwareAccelerationEnabled', enabled);
}
async canSelfUpdate(): Promise<boolean> {
const feedUrl = await this.ipcCall('getUpdateFeedUrl');
return Boolean(feedUrl);


@@ -0,0 +1,122 @@
/*
Copyright 2022 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import request from 'browser-request';
import { getVectorConfig } from "../../../src/vector/getconfig";
describe('getVectorConfig()', () => {
const setRequestMockImplementationOnce = (err?: unknown, response?: { status: number }, body?: string) =>
request.mockImplementationOnce((_opts, callback) => callback(err, response, body));
const prevDocumentDomain = document.domain;
const elementDomain = 'app.element.io';
const now = 1234567890;
const specificConfig = {
brand: 'specific',
}
const generalConfig = {
brand: 'general',
}
beforeEach(() => {
document.domain = elementDomain;
// stable value for cachebuster
jest.spyOn(Date, 'now').mockReturnValue(now);
jest.clearAllMocks();
});
afterAll(() => {
document.domain = prevDocumentDomain;
jest.spyOn(Date, 'now').mockRestore();
});
it('requests specific config for document domain', async () => {
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(specificConfig))
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
await getVectorConfig();
expect(request.mock.calls[0][0]).toEqual({ method: "GET", url: 'config.app.element.io.json', qs: { cachebuster: now } })
});
it('adds trailing slash to relativeLocation when not an empty string', async () => {
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(specificConfig))
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
await getVectorConfig('..');
expect(request.mock.calls[0][0]).toEqual(expect.objectContaining({ url: '../config.app.element.io.json' }))
expect(request.mock.calls[1][0]).toEqual(expect.objectContaining({ url: '../config.json' }))
});
it('returns parsed specific config when it is non-empty', async () => {
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(specificConfig))
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
const result = await getVectorConfig();
expect(result).toEqual(specificConfig);
});
it('returns general config when specific config succeeds but is empty', async () => {
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify({}))
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
const result = await getVectorConfig();
expect(result).toEqual(generalConfig);
});
it('returns general config when specific config 404s', async () => {
setRequestMockImplementationOnce(undefined, { status: 404 })
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
const result = await getVectorConfig();
expect(result).toEqual(generalConfig);
});
it('returns general config when specific config is fetched from a file and is empty', async () => {
setRequestMockImplementationOnce(undefined, { status: 0 }, '')
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
const result = await getVectorConfig();
expect(result).toEqual(generalConfig);
});
it('returns general config when specific config returns a non-200 status', async () => {
setRequestMockImplementationOnce(undefined, { status: 401 })
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
const result = await getVectorConfig();
expect(result).toEqual(generalConfig);
});
it('returns general config when specific config returns an error', async () => {
setRequestMockImplementationOnce('err1')
setRequestMockImplementationOnce(undefined, { status: 200 }, JSON.stringify(generalConfig))
const result = await getVectorConfig();
expect(result).toEqual(generalConfig);
});
it('rejects with an error when general config rejects', async () => {
setRequestMockImplementationOnce('err-specific');
setRequestMockImplementationOnce('err-general');
await expect(() => getVectorConfig()).rejects.toEqual({"err": "err-general", "response": undefined});
});
});


@@ -0,0 +1,186 @@
/*
Copyright 2022 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import request from 'browser-request';
import { UpdateCheckStatus } from 'matrix-react-sdk/src/BasePlatform';
import { MatrixClientPeg } from 'matrix-react-sdk/src/MatrixClientPeg';
import WebPlatform from '../../../../src/vector/platform/WebPlatform';
describe('WebPlatform', () => {
beforeEach(() => {
jest.clearAllMocks();
});
it('returns human readable name', () => {
const platform = new WebPlatform();
expect(platform.getHumanReadableName()).toEqual('Web Platform');
});
describe('notification support', () => {
const mockNotification = {
requestPermission: jest.fn(),
permission: 'notGranted',
}
beforeEach(() => {
// @ts-ignore
window.Notification = mockNotification;
mockNotification.permission = 'notGranted';
});
it('supportsNotifications returns false when platform does not support notifications', () => {
// @ts-ignore
window.Notification = undefined;
expect(new WebPlatform().supportsNotifications()).toBe(false);
});
it('supportsNotifications returns true when platform supports notifications', () => {
expect(new WebPlatform().supportsNotifications()).toBe(true);
});
it('maySendNotifications returns true when notification permissions are not granted', () => {
expect(new WebPlatform().maySendNotifications()).toBe(false);
});
it('maySendNotifications returns true when notification permissions are granted', () => {
mockNotification.permission = 'granted'
expect(new WebPlatform().maySendNotifications()).toBe(true);
});
it('requests notification permissions and returns result ', async () => {
mockNotification.requestPermission.mockImplementation(callback => callback('test'));
const platform = new WebPlatform();
const result = await platform.requestNotificationPermission();
expect(result).toEqual('test');
});
});
describe('app version', () => {
const envVersion = process.env.VERSION;
const prodVersion = '1.10.13';
const setRequestMockImplementation = (err?: unknown, response?: { status: number }, body?: string) =>
request.mockImplementation((_opts, callback) => callback(err, response, body));
beforeEach(() => {
jest.spyOn(MatrixClientPeg, 'userRegisteredWithinLastHours').mockReturnValue(false);
})
afterAll(() => {
process.env.VERSION = envVersion;
});
it('should return true from canSelfUpdate()', async () => {
const platform = new WebPlatform();
const result = await platform.canSelfUpdate();
expect(result).toBe(true);
});
it('getAppVersion returns normalized app version', async () => {
process.env.VERSION = prodVersion;
const platform = new WebPlatform();
const version = await platform.getAppVersion();
expect(version).toEqual(prodVersion);
process.env.VERSION = `v${prodVersion}`;
const version2 = await platform.getAppVersion();
// v prefix removed
expect(version2).toEqual(prodVersion);
process.env.VERSION = `version not like semver`;
const notSemverVersion = await platform.getAppVersion();
expect(notSemverVersion).toEqual(`version not like semver`);
});
describe('pollForUpdate()', () => {
it('should return not available and call showNoUpdate when current version matches most recent version', async () => {
process.env.VERSION = prodVersion;
setRequestMockImplementation(undefined, { status: 200}, prodVersion);
const platform = new WebPlatform();
const showUpdate = jest.fn();
const showNoUpdate = jest.fn();
const result = await platform.pollForUpdate(showUpdate, showNoUpdate);
expect(result).toEqual({ status: UpdateCheckStatus.NotAvailable });
expect(showUpdate).not.toHaveBeenCalled();
expect(showNoUpdate).toHaveBeenCalled();
});
it('should strip v prefix from versions before comparing', async () => {
process.env.VERSION = prodVersion;
setRequestMockImplementation(undefined, { status: 200}, `v${prodVersion}`);
const platform = new WebPlatform();
const showUpdate = jest.fn();
const showNoUpdate = jest.fn();
const result = await platform.pollForUpdate(showUpdate, showNoUpdate);
// versions only differ by v prefix, no update
expect(result).toEqual({ status: UpdateCheckStatus.NotAvailable });
expect(showUpdate).not.toHaveBeenCalled();
expect(showNoUpdate).toHaveBeenCalled();
});
it('should return ready and call showUpdate when current version differs from most recent version', async () => {
process.env.VERSION = '0.0.0'; // old version
setRequestMockImplementation(undefined, { status: 200}, prodVersion);
const platform = new WebPlatform();
const showUpdate = jest.fn();
const showNoUpdate = jest.fn();
const result = await platform.pollForUpdate(showUpdate, showNoUpdate);
expect(result).toEqual({ status: UpdateCheckStatus.Ready });
expect(showUpdate).toHaveBeenCalledWith('0.0.0', prodVersion);
expect(showNoUpdate).not.toHaveBeenCalled();
});
it('should return ready without showing update when user registered in last 24', async () => {
process.env.VERSION = '0.0.0'; // old version
jest.spyOn(MatrixClientPeg, 'userRegisteredWithinLastHours').mockReturnValue(true);
setRequestMockImplementation(undefined, { status: 200}, prodVersion);
const platform = new WebPlatform();
const showUpdate = jest.fn();
const showNoUpdate = jest.fn();
const result = await platform.pollForUpdate(showUpdate, showNoUpdate);
expect(result).toEqual({ status: UpdateCheckStatus.Ready });
expect(showUpdate).not.toHaveBeenCalled();
expect(showNoUpdate).not.toHaveBeenCalled();
});
it('should return error when version check fails', async () => {
setRequestMockImplementation('oups');
const platform = new WebPlatform();
const showUpdate = jest.fn();
const showNoUpdate = jest.fn();
const result = await platform.pollForUpdate(showUpdate, showNoUpdate);
expect(result).toEqual({ status: UpdateCheckStatus.Error, detail: 'Unknown Error' });
expect(showUpdate).not.toHaveBeenCalled();
expect(showNoUpdate).not.toHaveBeenCalled();
});
});
});
});

yarn.lock (1142 changed lines; file diff suppressed because it is too large)