Import is not working: An error occurred while importing dashboard: Import dashboard failed for an unknown reason #19222

Open
Narendra678 opened this issue Mar 16, 2022 · 30 comments
Labels: #bug Bug report, need:more-info Requires more information from author

Comments

@Narendra678

Hi Team,

I am trying to import a dashboard and getting the error below:
An error occurred while importing dashboard: Import dashboard failed for an unknown reason

Note: when I try the import, I can see the dataset gets added.
When I import a chart on its own, it works.
I am also able to import other reports.

Regards,
Naren

@Narendra678 Narendra678 added the #bug Bug report label Mar 16, 2022
@AAfghahi
Member

Hello! Could you take some screenshots of what you are experiencing?

@AAfghahi AAfghahi added the need:more-info Requires more information from author label Mar 17, 2022
@Narendra678
Author

[screenshot of the import error]

@Narendra678
Author

2022-03-18 07:29:47 default[1-3-1qa] Error running import command
Traceback (most recent call last):
  File "/workspace/superset/commands/importers/v1/__init__.py", line 67, in run
    self._import(db.session, self._configs, self.overwrite)
  File "/workspace/superset/dashboards/commands/importers/v1/__init__.py", line 126, in _import
    config = update_id_refs(config, chart_ids, dataset_info)
  File "/workspace/superset/dashboards/commands/importers/v1/utils.py", line 85, in update_id_refs
    for old_id, columns in metadata["filter_scopes"].items()
  File "/workspace/superset/dashboards/commands/importers/v1/utils.py", line 85, in
    for old_id, columns in metadata["filter_scopes"].items()
KeyError: 580

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/superset/dashboards/commands/importers/dispatcher.py", line 58, in run
    command.run()
  File "/workspace/superset/commands/importers/v1/__init__.py", line 71, in run
    raise self.import_error()
superset.dashboards.commands.exceptions.DashboardImportError: Import dashboard failed for an unknown reason

2022-03-18 07:29:47 default[1-3-1qa] 2022-03-18 07:29:47,754:ERROR:superset.dashboards.commands.importers.dispatcher:Error running import command
Traceback (most recent call last):
  File "/workspace/superset/commands/importers/v1/__init__.py", line 67, in run
    self._import(db.session, self._configs, self.overwrite)
  File "/workspace/superset/dashboards/commands/importers/v1/__init__.py", line 126, in _import
    config = update_id_refs(config, chart_ids, dataset_info)
  File "/workspace/superset/dashboards/commands/importers/v1/utils.py", line 85, in update_id_refs
    for old_id, columns in metadata["filter_scopes"].items()
  File "/workspace/superset/dashboards/commands/importers/v1/utils.py", line 85, in
    for old_id, columns in metadata["filter_scopes"].items()
KeyError: 580

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/superset/dashboards/commands/importers/dispatcher.py", line 58, in run
    command.run()
  File "/workspace/superset/commands/importers/v1/__init__.py", line 71, in run
    raise self.import_error()
superset.dashboards.commands.exceptions.DashboardImportError: Import dashboard failed for an unknown reason

2022-03-18 07:29:47 default[1-3-1qa] Import dashboard failed for an unknown reason
2022-03-18 07:29:47 default[1-3-1qa] 2022-03-18 07:29:47,755:WARNING:superset.views.base:Import dashboard failed for an unknown reason
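
For context, the KeyError: 580 above comes from metadata.filter_scopes in the exported dashboard YAML referencing a chart id that the importer cannot map to any chart in the bundle. A hedged workaround sketch (analogous to the expanded_slices workaround shared later in this thread) is to strip the filter_scopes block from the dashboard YAML inside the unzipped export before re-zipping and importing; note that this loses the saved filter scoping, and the dashboard_export*/dashboards/*.yaml layout is an assumption about the standard export bundle:

import glob
import yaml  # PyYAML

# Walk the unzipped export bundle and drop metadata.filter_scopes from each dashboard file.
for path in glob.glob("dashboard_export*/dashboards/*.yaml"):
    with open(path) as f:
        dashboard = yaml.safe_load(f)
    metadata = dashboard.get("metadata") or {}
    if metadata.pop("filter_scopes", None) is not None:
        with open(path, "w") as f:
            yaml.safe_dump(dashboard, f, sort_keys=False)
        print(f"removed filter_scopes from {path}")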

@AAfghahi
Member

So, do you get this toast before you upload a dashboard, or after you select a dashboard file and then press Import?

@Narendra678
Author

Export (download) works in the first environment; I then import (upload) into a second environment.

The error occurs during the import (upload) into the second environment.

@nerdyslacker

I am facing the same issue... but it only happens with dashboards that use multiple datasets.

@AAfghahi
Member

OK, that makes sense; I will try to replicate this bug today. Could you tell me which database engines you were running, and whether the error is agnostic of the chart/viz types involved?

@Narendra678
Author

ok makes sense, I will try to replicate this bug today. Could you tell me what engines you were running and if it is agnostic of the chart/viz types involved.

Below are the latest error logs: ERROR:root:'NoneType' object has no attribute 'uuid'

2022-03-22 10:14:04 default[1-3-1noimc] "GET /api/v1/dashboard/export/?q=!(14)&token=zVVSihwxp HTTP/1.1" 500
2022-03-22 10:14:05 default[1-3-1noimc] 2022-03-22 10:14:05,189:ERROR:root:'NoneType' object has no attribute 'uuid'
2022-03-22 10:14:05 default[1-3-1noimc] Traceback (most recent call last):
  File "/layers/google.python.pip/pip/lib/python3.7/site-packages/flask_appbuilder/api/__init__.py", line 85, in wraps
    return f(self, *args, **kwargs)
  File "/workspace/superset/views/base_api.py", line 85, in wraps
    raise ex
  File "/workspace/superset/views/base_api.py", line 82, in wraps
    duration, response = time_function(f, self, *args, **kwargs)
  File "/workspace/superset/utils/core.py", line 1429, in time_function
    response = func(*args, **kwargs)
  File "/layers/google.python.pip/pip/lib/python3.7/site-packages/flask_appbuilder/api/__init__.py", line 155, in wraps
    return f(self, *args, **kwargs)
  File "/workspace/superset/utils/log.py", line 241, in wrapper
    value = f(*args, **kwargs)
  File "/workspace/superset/dashboards/api.py", line 718, in export
    requested_ids
  File "/workspace/superset/commands/export.py", line 61, in run
    for file_name, file_content in self._export(model):
  File "/workspace/superset/dashboards/commands/export.py", line 140, in _export
    target["datasetUuid"] = str(dataset.uuid)
AttributeError: 'NoneType' object has no attribute 'uuid'
2022-03-22 10:14:44 default[1-3-1noimc] "GET /dashboard/list/?pageIndex=0&sortColumn=changed_on_delta_humanized&sortOrder=desc&viewMode=table HTTP/1.1" 200
2022-03-22 10:14:46 default[1-3-1noimc] "GET /api/v1/dashboard/related/created_by?q=() HTTP/1.1" 200
2022-03-22 10:14:46 default[1-3-1noimc] "GET /api/v1/dashboard/related/owners?q=() HTTP/1.1" 200
2022-03-22 10:14:47 default[1-3-1noimc] "GET /api/v1/dashboard/?q=(order_column:changed_on_delta_humanized,order_direction:desc,page:0,page_size:25) HTTP/1.1" 200
2022-03-22 10:14:48 default[1-3-1noimc] "GET /api/v1/dashboard/_info?q=(keys:!(permissions)) HTTP/1.1" 200
2022-03-22 10:14:48 default[1-3-1noimc] "GET /static/assets/images/favicon.png HTTP/1.1" 304
2022-03-22 10:14:48 default[1-3-1noimc] "GET /api/v1/dashboard/favorite_status/?q=!(47,50,52,48,41,55,10,29,14,59,56,6,53,51,46,11,42,43,45,33,19,44,40,38,20) HTTP/1.1" 200
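
For what it's worth, the export traceback above fails on target["datasetUuid"] = str(dataset.uuid) with dataset being None, which suggests a chart on the dashboard points at a datasource row that no longer exists. A hedged way to check for such dangling references is to query the Superset metadata database directly; the table and column names below (slices, tables, datasource_id, datasource_type) are assumptions based on Superset's standard schema, so adjust for your setup:

import sqlalchemy as sa

# Connection string for the Superset *metadata* database (placeholder; adjust for your setup).
engine = sa.create_engine("postgresql://superset:superset@localhost:5432/superset")

# Charts (slices) whose SQLA dataset row is missing.
query = sa.text("""
    SELECT s.id, s.slice_name, s.datasource_id
    FROM slices s
    LEFT JOIN tables t
        ON t.id = s.datasource_id AND s.datasource_type = 'table'
    WHERE t.id IS NULL
""")

with engine.connect() as conn:
    for row in conn.execute(query):
        print(f"chart {row.id} ({row.slice_name}) references missing dataset {row.datasource_id}")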

@smetirem

smetirem commented Apr 13, 2022

I am having the same issue. I get the same error when trying to import even a single dataset. Importing a chart (when the linked dataset exists) is not possible either.

@Narendra678
Author

I recreated the report, and then it worked for me.

@yiyux

yiyux commented Jul 8, 2022

Hello, I am trying to import datasets via the UI from a dev environment (Superset 1.5.0) into a new host running 2.0.0rc2.
Same error here.

superset_app | 2022-07-08 21:14:09,879:INFO:superset.datasets.commands.importers.v0:Importing dataset from file 20220708_200853.yaml
superset_app | Error running import command
Traceback (most recent call last):
  File "/app/superset/datasets/commands/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/datasets/commands/importers/v0.py", line 266, in run
    params = json.loads(dataset["params"])
KeyError: 'params'
superset_app | 2022-07-08 21:14:09,880:ERROR:superset.datasets.commands.importers.dispatcher:Error running import command
Traceback (most recent call last):
  File "/app/superset/datasets/commands/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/datasets/commands/importers/v0.py", line 266, in run
    params = json.loads(dataset["params"])
KeyError: 'params'
superset_app | 'params'
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/security/decorators.py", line 89, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/views/base_api.py", line 113, in wraps
    raise ex
  File "/app/superset/views/base_api.py", line 110, in wraps
    duration, response = time_function(f, self, *args, **kwargs)
  File "/app/superset/utils/core.py", line 1507, in time_function
    response = func(*args, **kwargs)
  File "/app/superset/utils/log.py", line 245, in wrapper
    value = f(*args, **kwargs)
  File "/app/superset/views/base_api.py", line 98, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/datasets/api.py", line 766, in import_
    command.run()
  File "/app/superset/datasets/commands/importers/dispatcher.py", line 68, in run
    raise exc
  File "/app/superset/datasets/commands/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/datasets/commands/importers/v0.py", line 266, in run
    params = json.loads(dataset["params"])
KeyError: 'params'
superset_app | 2022-07-08 21:14:09,881:ERROR:superset.views.base:'params'
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/security/decorators.py", line 89, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/views/base_api.py", line 113, in wraps
    raise ex
  File "/app/superset/views/base_api.py", line 110, in wraps
    duration, response = time_function(f, self, *args, **kwargs)
  File "/app/superset/utils/core.py", line 1507, in time_function
    response = func(*args, **kwargs)
  File "/app/superset/utils/log.py", line 245, in wrapper
    value = f(*args, **kwargs)
  File "/app/superset/views/base_api.py", line 98, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/datasets/api.py", line 766, in import_
    command.run()
  File "/app/superset/datasets/commands/importers/dispatcher.py", line 68, in run
    raise exc
  File "/app/superset/datasets/commands/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/datasets/commands/importers/v0.py", line 266, in run
    params = json.loads(dataset["params"])
KeyError: 'params'
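
For what it's worth, this traceback shows the legacy v0 dataset importer reading dataset["params"], so a single .yaml exported from an older version can fail simply because some table entries carry no params key (others further down report that importing the full .zip bundle, which goes through the newer importer, worked). A hedged diagnostic sketch that scans such a YAML file for table entries lacking params, without assuming the exact v0 layout (it only looks for dicts with a table_name):

import sys
import yaml  # PyYAML

def report_missing_params(node):
    """Recursively look for dataset-like dicts that have no 'params' key."""
    if isinstance(node, dict):
        if "table_name" in node and "params" not in node:
            print(f"table {node['table_name']!r} has no 'params' key")
        for value in node.values():
            report_missing_params(value)
    elif isinstance(node, list):
        for item in node:
            report_missing_params(item)

# usage: python check_params.py 20220708_200853.yaml
with open(sys.argv[1]) as f:
    report_missing_params(yaml.safe_load(f))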

@lanzhenw

Having the same issue here

@javiermas

I had the same issue, but it worked when I imported the .zip file instead of the individual .yaml file. I guess the error that is raised should be clearer about this.

@wulfuric

We are trying to build automation to promote dashboards between environments, and we're also seeing dashboard import failures. Ours turned out to be caused by an expanded_slices key that the importer couldn't handle; in our case it was pointing at a chart that is no longer part of the dashboard. No idea how it got there, maybe through the UI. Ultimately, if we unzipped the dashboards and removed the key with yq -i e 'del(.metadata.expanded_slices)' $f, they would import.

This led to this super hacky snippet in our pipeline: unzip, remove expanded_slices, then re-zip and re-upload. Ultimately the right fix would be for the upload to either ignore or handle the expanded_slices field.

cd assets/dashboards
# bail out early if there are no export bundles to process
# (the original `[[ ! -f *.zip ]]` test only matched a literal file named "*.zip")
if ! ls ./*.zip >/dev/null 2>&1; then
  echo 'no dashboards marked for upload'
  exit
fi
unzip '*.zip'
rm -rf ./*.zip
# rm -rf ./*/databases
ls
yq --help
yq --version
# strip the metadata key the importer chokes on
for f in ./*/dashboards/*.yaml; do
  yq -i e 'del(.metadata.expanded_slices)' "$f"
done
# re-zip each extracted export folder so it can be re-uploaded
for f in ./*/; do
  if [ -d "$f" ]; then
    # will not run if no directories are available
    echo "$f"
    zip -r "${f%/}.zip" "$f"
  fi
done

@terrancesnyder

Same issue: a simple export from the dashboard and then using the CLI to import fails, which makes it hard to get DevOps and CI working.

@andy-clapson
Contributor

I am also seeing this issue when importing both charts and dashboards. I'm unclear on the precise cause (there is not much that is useful in the logs I can see).

I will note that I have a CI script that manages imports via the REST API, and that has continued to work just fine. For anyone building automation around data/dashboard loading, I would recommend that approach.
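
Not andy-clapson's script, but a minimal sketch of that REST-API approach, assuming Superset's standard endpoints (/api/v1/security/login, /api/v1/security/csrf_token/ and /api/v1/dashboard/import/) and placeholder host, credentials and file names; adjust for your environment:

import requests

BASE = "https://superset.example.com"  # placeholder host
session = requests.Session()

# 1. Log in and grab a JWT access token.
resp = session.post(
    f"{BASE}/api/v1/security/login",
    json={"username": "admin", "password": "secret", "provider": "db", "refresh": True},
)
resp.raise_for_status()
session.headers["Authorization"] = f"Bearer {resp.json()['access_token']}"

# 2. Fetch a CSRF token (needed for mutating requests when CSRF protection is enabled).
csrf = session.get(f"{BASE}/api/v1/security/csrf_token/").json()["result"]
session.headers["X-CSRFToken"] = csrf

# 3. Upload the exported bundle.
with open("dashboard_export.zip", "rb") as f:
    resp = session.post(
        f"{BASE}/api/v1/dashboard/import/",
        files={"formData": ("dashboard_export.zip", f, "application/zip")},
        data={"overwrite": "true"},
    )
resp.raise_for_status()
print(resp.json())

The dataset and chart import endpoints (/api/v1/dataset/import/ and /api/v1/chart/import/) should follow the same multipart pattern.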

@pyryjook

pyryjook commented Sep 27, 2023

@andy-clapson Could you share the script, or the main parts of it, with us too? It would be highly appreciated!

@rusackas
Member

rusackas commented Mar 8, 2024

Hey folks, I see this went silent around the time 3.0 came out. Is this still an issue?

@pyryjook

pyryjook commented Mar 8, 2024

Hi @rusackas,

Actually I tested it with version 3.0.0 back then, and the problem was still there.

There was also some recent discussion regarding this in Slack: https://apache-superset.slack.com/archives/C015WAZL0KH/p1695889915182819

@butuzov

butuzov commented Mar 26, 2024

Trying now with 3.1: I wasn't even able to track down DashboardImportError, as it looks like dead code now...

@Davidkramer1999

I'm experiencing the same issue here.

@malikvashu29

malikvashu29 commented Apr 9, 2024

I'm experiencing the same issue while importing a single dataset. I'm getting the error "An error occurred while importing dataset: Could not find a valid command to import file. Please re-export your file and try importing again".

@rusackas
Member

rusackas commented Apr 9, 2024

Is anyone experiencing this able to make an example export to share here that we can troubleshoot with? I haven't been able to reproduce the issue. It seems like if you make a virtual dataset that generates the data as part of the query, you might be safe to share the export? CC @betodealmeida in case they have any idea how to troubleshoot/repro/fix this.

@marituhone

marituhone commented Apr 25, 2024

@rusackas @betodealmeida I get the same issue in 4.0.0; how can I resolve it?

@1yuv
Contributor

1yuv commented Apr 25, 2024

Hi @rusackas , I was able to reproduce this today as well, going from 1.3 to 4.0.
Steps:

  1. Export multiple datasets (physical/virtual) from the older version; this gives you a YAML file.
  2. Try importing those datasets into Superset 4.0.
  3. You see an error like this:
[screenshot of the import error]
View Logs
Importing dataset from file 20240425_165955.yaml
2024-04-25 17:09:48,511:INFO:superset.commands.dataset.importers.v0:Importing dataset from file 20240425_165955.yaml
Error running import command
Traceback (most recent call last):
  File "/app/superset/commands/dataset/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/commands/dataset/importers/v0.py", line 255, in run
    params = json.loads(dataset["params"])
KeyError: 'params'
2024-04-25 17:09:48,511:ERROR:superset.commands.dataset.importers.dispatcher:Error running import command
Traceback (most recent call last):
  File "/app/superset/commands/dataset/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/commands/dataset/importers/v0.py", line 255, in run
    params = json.loads(dataset["params"])
KeyError: 'params'
'params'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/flask/app.py", line 1823, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.10/site-packages/flask/app.py", line 1799, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "/usr/local/lib/python3.10/site-packages/flask_appbuilder/security/decorators.py", line 95, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/views/base_api.py", line 127, in wraps
    raise ex
  File "/app/superset/views/base_api.py", line 121, in wraps
    duration, response = time_function(f, self, *args, **kwargs)
  File "/app/superset/utils/core.py", line 1463, in time_function
    response = func(*args, **kwargs)
  File "/app/superset/utils/log.py", line 255, in wrapper
    value = f(*args, **kwargs)
  File "/app/superset/views/base_api.py", line 108, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/datasets/api.py", line 926, in import_
    command.run()
  File "/app/superset/commands/dataset/importers/dispatcher.py", line 68, in run
    raise exc
  File "/app/superset/commands/dataset/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/commands/dataset/importers/v0.py", line 255, in run
    params = json.loads(dataset["params"])
KeyError: 'params'
2024-04-25 17:09:48,511:ERROR:superset.views.base:'params'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/flask/app.py", line 1823, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.10/site-packages/flask/app.py", line 1799, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "/usr/local/lib/python3.10/site-packages/flask_appbuilder/security/decorators.py", line 95, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/views/base_api.py", line 127, in wraps
    raise ex
  File "/app/superset/views/base_api.py", line 121, in wraps
    duration, response = time_function(f, self, *args, **kwargs)
  File "/app/superset/utils/core.py", line 1463, in time_function
    response = func(*args, **kwargs)
  File "/app/superset/utils/log.py", line 255, in wrapper
    value = f(*args, **kwargs)
  File "/app/superset/views/base_api.py", line 108, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/datasets/api.py", line 926, in import_
    command.run()
  File "/app/superset/commands/dataset/importers/dispatcher.py", line 68, in run
    raise exc
  File "/app/superset/commands/dataset/importers/dispatcher.py", line 57, in run
    command.run()
  File "/app/superset/commands/dataset/importers/v0.py", line 255, in run
    params = json.loads(dataset["params"])
KeyError: 'params'

@constantin-huetterer

constantin-huetterer commented May 29, 2024

We had the same issue. Downgrading to 4.0.0 fixed the problem for us:

export TAG=4.0.0
docker compose -f docker-compose-image-tag.yml up

@rusackas, here are two example Dashboards, one created with the latest version of Superset and one after the workaround above (with tag 4.0).

Broken export in latest Superset version

Working export (v. 4.0)

@rusackas
Member

@dosu-bot


dosubot bot commented May 29, 2024

@rusackas Hello! I'm here to help you with any bugs, questions, or contributions while you wait for a human maintainer. Feel free to ask me anything. Dosu

The error you're encountering during the dashboard import process in Apache Superset is likely due to issues with the filter_scopes in the dashboard's metadata. Specifically, the error KeyError: 580 indicates that filter_scopes references a chart id (580) that the import process cannot map to any chart in the imported bundle.

To resolve this issue, you can try the following steps:

  1. Check the filter_scopes in the Exported File:

    • Ensure that the filter_scopes in the exported dashboard file are correctly defined and do not reference any non-existent keys.
  2. Use the ImportDashboardsCommand Class:

    • The ImportDashboardsCommand class in Superset handles the import process and ensures that the dashboard metadata and position references are correctly mapped to the new IDs. This class also handles the conversion of filter scopes and other necessary adjustments.

Here is a relevant code snippet from the ImportDashboardsCommand class that shows how it handles the import process:

class ImportDashboardsCommand(ImportModelsCommand):
    """Import dashboards"""

    dao = DashboardDAO
    model_name = "dashboard"
    prefix = "dashboards/"
    schemas: dict[str, Schema] = {
        "charts/": ImportV1ChartSchema(),
        "dashboards/": ImportV1DashboardSchema(),
        "datasets/": ImportV1DatasetSchema(),
        "databases/": ImportV1DatabaseSchema(),
    }
    import_error = DashboardImportError

    @staticmethod
    def _import(configs: dict[str, Any], overwrite: bool = False) -> None:
        # discover charts and datasets associated with dashboards
        chart_uuids: set[str] = set()
        dataset_uuids: set[str] = set()
        for file_name, config in configs.items():
            if file_name.startswith("dashboards/"):
                chart_uuids.update(find_chart_uuids(config["position"]))
                dataset_uuids.update(
                    find_native_filter_datasets(config.get("metadata", {}))
                )

        # discover datasets associated with charts
        for file_name, config in configs.items():
            if file_name.startswith("charts/") and config["uuid"] in chart_uuids:
                dataset_uuids.add(config["dataset_uuid"])

        # discover databases associated with datasets
        database_uuids: set[str] = set()
        for file_name, config in configs.items():
            if file_name.startswith("datasets/") and config["uuid"] in dataset_uuids:
                database_uuids.add(config["database_uuid"])

        # import related databases
        database_ids: dict[str, int] = {}
        for file_name, config in configs.items():
            if file_name.startswith("databases/") and config["uuid"] in database_uuids:
                database = import_database(config, overwrite=False)
                database_ids[str(database.uuid)] = database.id

        # import datasets with the correct parent ref
        dataset_info: dict[str, dict[str, Any]] = {}
        for file_name, config in configs.items():
            if (
                file_name.startswith("datasets/")
                and config["database_uuid"] in database_ids
            ):
                config["database_id"] = database_ids[config["database_uuid"]]
                dataset = import_dataset(config, overwrite=False)
                dataset_info[str(dataset.uuid)] = {
                    "datasource_id": dataset.id,
                    "datasource_type": dataset.datasource_type,
                    "datasource_name": dataset.table_name,
                }

        # import charts with the correct parent ref
        charts = []
        chart_ids: dict[str, int] = {}
        for file_name, config in configs.items():
            if (
                file_name.startswith("charts/")
                and config["dataset_uuid"] in dataset_info
            ):
                # update datasource id, type, and name
                dataset_dict = dataset_info[config["dataset_uuid"]]
                config.update(dataset_dict)
                dataset_uid = f"{dataset_dict['datasource_id']}__{dataset_dict['datasource_type']}"
                config["params"].update({"datasource": dataset_uid})
                if "query_context" in config:
                    config["query_context"] = None

                chart = import_chart(config, overwrite=False)
                charts.append(chart)
                chart_ids[str(chart.uuid)] = chart.id

        # store the existing relationship between dashboards and charts
        existing_relationships = db.session.execute(
            select([dashboard_slices.c.dashboard_id, dashboard_slices.c.slice_id])
        ).fetchall()

        # import dashboards
        dashboards: list[Dashboard] = []
        dashboard_chart_ids: list[tuple[int, int]] = []
        for file_name, config in configs.items():
            if file_name.startswith("dashboards/"):
                config = update_id_refs(config, chart_ids, dataset_info)
                dashboard = import_dashboard(config, overwrite=overwrite)
                dashboards.append(dashboard)
                for uuid in find_chart_uuids(config["position"]):
                    if uuid not in chart_ids:
                        break
                    chart_id = chart_ids[uuid]
                    if (dashboard.id, chart_id) not in existing_relationships:
                        dashboard_chart_ids.append((dashboard.id, chart_id))

        # set ref in the dashboard_slices table
        values = [
            {"dashboard_id": dashboard_id, "slice_id": chart_id}
            for (dashboard_id, chart_id) in dashboard_chart_ids
        ]
        db.session.execute(dashboard_slices.insert(), values)

        # Migrate any filter-box charts to native dashboard filters.
        for dashboard in dashboards:
            migrate_dashboard(dashboard)

        # Remove all obsolete filter-box charts.
        for chart in charts:
            if chart.viz_type == "filter_box":
                db.session.delete(chart)

This class will help you update the dashboard metadata to use the new IDs, ensuring that the references are correct and preventing errors related to missing or incorrect parameters. Additionally, it will handle the migration of filter-box charts to native dashboard filters, which is necessary due to the removal of the Filter Box code in version 4.0.0 [1][2][3][4].


@wardsi

wardsi commented Sep 10, 2024

Hi - I know this is probably really old; however, I made this work, importing and exporting from v4.0.1 to 4.1rc2, simply by moving all the files in the nested directory of the export zip to the root, then zipping and uploading to Superset. Hope this helps others. This attempt is a direct result of the helpful code above (Dosu).
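
If it helps anyone scripting that workaround, here is a hedged sketch of the flattening step in Python; the file names are placeholders, and it simply moves everything out of the single top-level folder of the export zip:

import zipfile

src = "dashboard_export.zip"        # original export (placeholder name)
dst = "dashboard_export_flat.zip"   # flattened copy to upload instead

with zipfile.ZipFile(src) as zin, zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as zout:
    for info in zin.infolist():
        if info.is_dir():
            continue
        # Drop the leading "<export_folder>/" path component, keep the rest.
        parts = info.filename.split("/", 1)
        flat_name = parts[1] if len(parts) == 2 else parts[0]
        zout.writestr(flat_name, zin.read(info.filename))

print(f"wrote {dst}")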

@SteveKhoa

@wardsi Hi, some dashboards work with this workaround, but some do not.
