Databricks Workspace: Making Big Data Frameworks Easy-to-Use
Databricks Workspace substantially simplifies the use of big data frameworks in general, and Spark in particular, by delivering three powerful web-based applications: notebooks, dashboards, and a job launcher.
Notebooks: Currently, notebooks allow users to query and analyze data using Python, SQL, and Scala.
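To give a feel for this, here is a minimal sketch of the kind of cell one might run in a Python notebook against Spark. The table name (events) and its columns are hypothetical, and the code uses the stock PySpark API rather than any workspace-specific helpers.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("notebook-sketch").getOrCreate()

# Load a table registered in the workspace (the name is hypothetical).
events = spark.table("events")

# The same aggregation expressed through the DataFrame API...
daily_counts = events.groupBy("event_date").count().orderBy("event_date")
daily_counts.show()

# ...and through SQL, as a SQL notebook cell would run it.
spark.sql(
    "SELECT event_date, COUNT(*) AS n "
    "FROM events GROUP BY event_date ORDER BY event_date"
).show()
```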
Dashboards: Dashboards are interactive, as every plot can depend on one or more variables. When these variables are updated, the query behind each plot is automatically re-executed, and the plot is regenerated.
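The pattern can be illustrated in plain PySpark: a plot is just a function of its variables, and changing a variable re-runs the underlying query and redraws the plot. The table and column names below are hypothetical, and this is only an illustration of the idea, not the dashboard API itself.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col
import matplotlib.pyplot as plt

spark = SparkSession.builder.appName("dashboard-sketch").getOrCreate()

def render_plot(country):
    """Re-run the query for the given variable and regenerate the plot."""
    rows = (
        spark.table("events")               # hypothetical table
        .filter(col("country") == country)  # the plot depends on this variable
        .groupBy("event_date")
        .count()
        .orderBy("event_date")
        .collect()
    )
    dates = [r["event_date"] for r in rows]
    counts = [r["count"] for r in rows]
    plt.figure()
    plt.plot(dates, counts)
    plt.title("Daily events: {}".format(country))
    plt.show()

# Updating the variable triggers a fresh query and a new plot.
render_plot("US")
render_plot("DE")
```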
Job Launcher: The job launcher allows users to run arbitrary Spark jobs programmatically. For example, users can schedule jobs to run on a regular basis or whenever their input data changes.
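As a rough illustration of programmatic job launching, the sketch below shells out to spark-submit from a standard Spark installation and reruns the application on a fixed schedule. The application path and arguments are hypothetical, and the loop is a stand-in for a real scheduler rather than the product's API.

```python
import subprocess
import time

def launch_job(app_path, *args):
    """Submit a Spark application and return its exit code."""
    cmd = ["spark-submit", "--master", "local[*]", app_path, *args]
    return subprocess.call(cmd)

def run_daily(app_path, *args):
    """Run the job once a day (a stand-in for a real scheduler or trigger)."""
    while True:
        exit_code = launch_job(app_path, *args)
        print("job finished with exit code", exit_code)
        time.sleep(24 * 60 * 60)

if __name__ == "__main__":
    # Hypothetical application and paths, for illustration only.
    run_daily("etl_job.py", "--input", "/data/raw", "--output", "/data/clean")
```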