Import is not working: An error occurred while importing dashboard: Import dashboard failed for an unknown reason #19222
Hello! Could you take some screenshots of what you are experiencing?
2022-03-18 07:29:47 default[1-3-1qa] Error running import command
So do you get this toast before you upload a dashboard, or after you select a dashboard and then press import?
Export (download) works in the first environment; the error happens when importing (uploading) into the second environment.
I am facing the same issue... but this happens only in the case of multiple datasets.
OK, makes sense; I will try to replicate this bug today. Could you tell me what engines you were running, and whether it is agnostic of the chart/viz types involved?
Below are the latest error logs:
ERROR:root:'NoneType' object has no attribute 'uuid'
2022-03-22 10:14:04 default[1-3-1noimc] "GET /api/v1/dashboard/export/?q=!(14)&token=zVVSihwxp HTTP/1.1" 500
I am having the same issue. I get the same error when trying to import even a single dataset. Importing a chart (when the linked dataset exists) is not possible either.
I recreated the report, and then it worked for me.
Hello, I am trying to import datasets via the UI from a dev environment (Superset 1.5.0) to a new host running 2.0.0rc2.
Having the same issue here.
I had the same issue, but it worked when I imported the .zip file instead of the individual .yaml file. I guess the error that is raised should be clearer about this.
We are trying to build automation to promote dashboards between environments, and we're also seeing dashboard import failures. Ours turned out to be due to the `expanded_slices` field. This led to a super hacky snippet in our pipeline to unzip the export, remove `expanded_slices`, and then re-upload. Ultimately the right fix will be for the upload to either ignore or handle the `expanded_slices` field.
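For reference, that unzip-edit-rezip step can be sketched in Python. This is a hedged sketch, not the pipeline snippet from the comment above: it assumes the standard export layout (dashboard YAML files under a `dashboards/` folder) and uses a deliberately naive line-based filter to drop the `expanded_slices` key and its nested block:

```python
import io
import zipfile

def strip_expanded_slices(zip_bytes: bytes) -> bytes:
    """Rewrite a Superset export zip, dropping the `expanded_slices`
    block from every dashboard YAML via a naive line-based filter."""
    out = io.BytesIO()
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as src, \
         zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            data = src.read(info.filename)
            if "dashboards/" in info.filename and info.filename.endswith(".yaml"):
                kept, skipping, indent = [], False, 0
                for line in data.decode("utf-8").splitlines(keepends=True):
                    stripped = line.lstrip(" ")
                    cur = len(line) - len(stripped)
                    if stripped.startswith("expanded_slices:"):
                        # start skipping the key and its nested block
                        skipping, indent = True, cur
                        continue
                    if skipping:
                        if stripped.strip() and cur <= indent:
                            skipping = False  # back to sibling level
                        else:
                            continue
                    kept.append(line)
                data = "".join(kept).encode("utf-8")
            dst.writestr(info.filename, data)
    return out.getvalue()
```

A real YAML parser would be more robust, but this mirrors the "super hacky" spirit of the original workaround and avoids non-stdlib dependencies.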
Same issue: a simple export from a dashboard and then importing via the CLI fails, which makes it hard to get DevOps and CI working.
I am also seeing this issue when importing both charts and dashboards; I'm unclear on the precise cause (not much that is useful in the logs I can see). I will note that I have a CI script that manages imports via the REST API, and that has continued to work just fine. For anyone building automation around data/dashboard loading, I would recommend that approach.
@andy-clapson Could you share the script, or the main parts of it, with us too? It would be highly appreciated!
Hey folks, I see this went silent around the time 3.0 came out. Is this still an issue? |
Hi @rusackas, Actually I tested it with version 3.0.0 back then, and the problem was still there. There was also some recent discussion regarding this in Slack: https://apache-superset.slack.com/archives/C015WAZL0KH/p1695889915182819 |
Try now with
I'm experiencing the same issue here.
I'm experiencing the same issue while importing a single dataset. I get the error "An error occurred while importing dataset: Could not find a valid command to import file. Please re-export your file and try importing again".
Is anyone experiencing this able to make an example export to share here that we can troubleshoot with? I haven't been able to reproduce the issue. It seems like if you make a virtual dataset that generates the data as part of the query, you might be safe to share the export? CC @betodealmeida in case they have any idea how to troubleshoot/repro/fix this.
@rusackas @betodealmeida I get the same issue in 4.0.0; how can I resolve it?
Hi @rusackas, I was able to reproduce this today as well, from 1.3 to 4.0.
We had the same issue. Downgrading to 4.0.0 fixed the problem for us:

export TAG=4.0.0
docker compose -f docker-compose-image-tag.yml up

@rusackas, here are two example dashboards, one created with the latest version of Superset and one after the workaround above (with tag 4.0.0).
@rusackas Hello! I'm here to help you with any bugs, questions, or contributions while you wait for a human maintainer. Feel free to ask me anything. — Dosu

The error you're encountering during the dashboard import process in Apache Superset is likely due to issues with how chart, dataset, and database references are resolved and remapped during import. Here is the relevant code snippet, the `ImportDashboardsCommand` class:

class ImportDashboardsCommand(ImportModelsCommand):
"""Import dashboards"""
dao = DashboardDAO
model_name = "dashboard"
prefix = "dashboards/"
schemas: dict[str, Schema] = {
"charts/": ImportV1ChartSchema(),
"dashboards/": ImportV1DashboardSchema(),
"datasets/": ImportV1DatasetSchema(),
"databases/": ImportV1DatabaseSchema(),
}
import_error = DashboardImportError
@staticmethod
def _import(configs: dict[str, Any], overwrite: bool = False) -> None:
# discover charts and datasets associated with dashboards
chart_uuids: set[str] = set()
dataset_uuids: set[str] = set()
for file_name, config in configs.items():
if file_name.startswith("dashboards/"):
chart_uuids.update(find_chart_uuids(config["position"]))
dataset_uuids.update(
find_native_filter_datasets(config.get("metadata", {}))
)
# discover datasets associated with charts
for file_name, config in configs.items():
if file_name.startswith("charts/") and config["uuid"] in chart_uuids:
dataset_uuids.add(config["dataset_uuid"])
# discover databases associated with datasets
database_uuids: set[str] = set()
for file_name, config in configs.items():
if file_name.startswith("datasets/") and config["uuid"] in dataset_uuids:
database_uuids.add(config["database_uuid"])
# import related databases
database_ids: dict[str, int] = {}
for file_name, config in configs.items():
if file_name.startswith("databases/") and config["uuid"] in database_uuids:
database = import_database(config, overwrite=False)
database_ids[str(database.uuid)] = database.id
# import datasets with the correct parent ref
dataset_info: dict[str, dict[str, Any]] = {}
for file_name, config in configs.items():
if (
file_name.startswith("datasets/")
and config["database_uuid"] in database_ids
):
config["database_id"] = database_ids[config["database_uuid"]]
dataset = import_dataset(config, overwrite=False)
dataset_info[str(dataset.uuid)] = {
"datasource_id": dataset.id,
"datasource_type": dataset.datasource_type,
"datasource_name": dataset.table_name,
}
# import charts with the correct parent ref
charts = []
chart_ids: dict[str, int] = {}
for file_name, config in configs.items():
if (
file_name.startswith("charts/")
and config["dataset_uuid"] in dataset_info
):
# update datasource id, type, and name
dataset_dict = dataset_info[config["dataset_uuid"]]
config.update(dataset_dict)
dataset_uid = f"{dataset_dict['datasource_id']}__{dataset_dict['datasource_type']}"
config["params"].update({"datasource": dataset_uid})
if "query_context" in config:
config["query_context"] = None
chart = import_chart(config, overwrite=False)
charts.append(chart)
chart_ids[str(chart.uuid)] = chart.id
# store the existing relationship between dashboards and charts
existing_relationships = db.session.execute(
select([dashboard_slices.c.dashboard_id, dashboard_slices.c.slice_id])
).fetchall()
# import dashboards
dashboards: list[Dashboard] = []
dashboard_chart_ids: list[tuple[int, int]] = []
for file_name, config in configs.items():
if file_name.startswith("dashboards/"):
config = update_id_refs(config, chart_ids, dataset_info)
dashboard = import_dashboard(config, overwrite=overwrite)
dashboards.append(dashboard)
for uuid in find_chart_uuids(config["position"]):
if uuid not in chart_ids:
break
chart_id = chart_ids[uuid]
if (dashboard.id, chart_id) not in existing_relationships:
dashboard_chart_ids.append((dashboard.id, chart_id))
# set ref in the dashboard_slices table
values = [
{"dashboard_id": dashboard_id, "slice_id": chart_id}
for (dashboard_id, chart_id) in dashboard_chart_ids
]
db.session.execute(dashboard_slices.insert(), values)
# Migrate any filter-box charts to native dashboard filters.
for dashboard in dashboards:
migrate_dashboard(dashboard)
# Remove all obsolete filter-box charts.
for chart in charts:
if chart.viz_type == "filter_box":
                db.session.delete(chart)

This class updates the dashboard metadata to use the new IDs, ensuring that the references are correct and preventing errors related to missing or incorrect parameters. It also handles the migration of filter-box charts to native dashboard filters, which is necessary because the Filter Box code was removed in version 4.0.0.
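The step most relevant to the errors in this thread is the ID remapping done by `update_id_refs`. As a toy illustration only (this is not Superset's actual implementation), rewriting `chartId` values in a dashboard `position` tree to the newly imported chart IDs might look like:

```python
def remap_chart_ids(position: dict, id_map: dict[int, int]) -> dict:
    """Toy sketch of the ID-remapping idea: rewrite `chartId` in each
    CHART node of a dashboard `position` tree to newly imported IDs.
    Returns a new dict; the input tree is not mutated."""
    out = {}
    for key, node in position.items():
        if isinstance(node, dict) and node.get("type") == "CHART":
            meta = dict(node.get("meta") or {})
            if meta.get("chartId") in id_map:
                meta["chartId"] = id_map[meta["chartId"]]
            node = {**node, "meta": meta}
        out[key] = node
    return out
```

If a chart ID referenced in `position` is missing from the map (for example, because the chart failed to import), the stale reference survives; that kind of dangling reference is consistent with the "unknown reason" failures described above.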
Hi - I know this is probably really old; however, I made importing and exporting work from v4.0.1 to 4.1rc2 simply by moving all the files in the nested directory of the export zip to the root, then re-zipping and uploading to Superset. Hope this helps others. This attempt is a direct result of the helpful code above (Dosu).
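One way to script that workaround, under the assumption (from the comment above) that the importer wants `metadata.yaml` and the `dashboards/` folder at the zip root rather than inside a top-level export folder:

```python
import io
import zipfile

def flatten_export(zip_bytes: bytes) -> bytes:
    """Strip the top-level folder from a Superset export zip, so e.g.
    `dashboard_export_.../dashboards/x.yaml` becomes `dashboards/x.yaml`."""
    out = io.BytesIO()
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as src, \
         zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            if info.is_dir():
                continue  # directory entries are recreated implicitly
            # drop everything before the first "/", keep the rest
            name = info.filename.split("/", 1)[-1]
            dst.writestr(name, src.read(info.filename))
    return out.getvalue()
```

As noted in the reply below, this helps with some dashboards but not all, so treat it as a diagnostic step rather than a fix.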
@wardsi Hi, some dashboards can work with this workaround, some do not. |
Hi Team,
I am trying to import a dashboard and am getting the error below:
An error occurred while importing dashboard: Import dashboard failed for an unknown reason
Note: when I try the import, I can see the dataset is added.
Importing a chart works, and I am able to import other reports.
Regards,
Naren