5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,8 @@
# dbt_fivetran_utils v0.4.11

## Feature Update
- Added SQL Server logic to the `wrap_in_quotes()` [macro](https://github.com/fivetran/dbt_fivetran_utils/tree/releases/v0.4.latest?tab=readme-ov-file#wrap_in_quotes-source) ([PR #138](https://github.com/fivetran/dbt_fivetran_utils/pull/138)).

# dbt_fivetran_utils v0.4.10

## Bug Fix
6 changes: 3 additions & 3 deletions README.md
@@ -305,7 +305,7 @@ This macro allows for cross database timestamp difference calculation for BigQuery

----
### try_cast ([source](macros/try_cast.sql))
This macro allows a field to be cast to a specified datatype. If the datatype is incompatible then a `null` value is provided. This macro is compatible with BigQuery, Redshift, Postgres, Snowflake, and Databricks.
This macro allows a field to be cast to a specified datatype. If the datatype is incompatible then a `null` value is provided. This macro is compatible with BigQuery, Redshift, Postgres, Snowflake, Databricks, and SQL Server.
> Please note: for Postgres and Redshift destinations, only the `numeric` datatype is supported by `try_cast`.

**Usage:**
```sql
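-- A hedged sketch, not taken from this diff: the field name is illustrative, and the
-- named arguments `field` and `type` are assumed from the macro's dispatch signature.
-- `numeric` is used here because it is the only datatype supported on Postgres and Redshift.
{{ fivetran_utils.try_cast(field="amount", type="numeric") }}
```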
@@ -317,7 +317,7 @@ This macro allows a field to be cast to a specified datatype. If the datatype is incompatible then a `null` value is provided.

----
### wrap_in_quotes ([source](macros/wrap_in_quotes.sql))
This macro takes a SQL object (i.e., a database, schema, or column) and returns it wrapped in database-appropriate quotes (and casing for Snowflake).
This macro takes a SQL object (i.e., a database, schema, or column) and returns it wrapped in database-appropriate quotes (and casing for Snowflake). It is compatible with BigQuery, Snowflake, Postgres, Redshift, Databricks, and SQL Server.

**Usage:**
```sql
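-- A hedged sketch, not taken from this diff: the object name below is illustrative.
-- On Postgres and SQL Server this renders the object wrapped in double quotes;
-- per the description above, Snowflake also adjusts casing.
{{ fivetran_utils.wrap_in_quotes("my_column") }}
```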
@@ -370,7 +370,7 @@ vars:

----
### fivetran_date_spine ([source](macros/fivetran_date_spine.sql))
This macro returns the SQL required to build a date spine. The spine will include the `start_date` (if it is aligned to the `datepart`), but it will not include the `end_date`.
This macro returns the SQL required to build a date spine. The spine will include the `start_date` (if it is aligned to the `datepart`), but it will not include the `end_date`. It is compatible with BigQuery, Snowflake, Redshift, Postgres, Databricks, and SQL Server.

For non-SQL Server databases, this will simply call [`dbt_utils.date_spine()`](https://github.com/dbt-labs/dbt-utils#date_spine-source). For SQL Server targets, this will manually create a spine, with code heavily leveraged from [`tsql-utils.date_spine()`](https://github.com/dbt-msft/tsql-utils/blob/main/macros/dbt_utils/datetime/date_spine.sql) but [adjusted for recent changes to dbt_utils](https://github.com/dbt-msft/tsql-utils/issues/96).
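
As a rough usage sketch, assuming the arguments mirror `dbt_utils.date_spine()` (the dates below are illustrative, not taken from this diff):

```sql
{{ fivetran_utils.fivetran_date_spine(
    datepart="day",
    start_date="cast('2024-01-01' as date)",
    end_date="cast('2024-02-01' as date)"
) }}
```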

4 changes: 4 additions & 0 deletions macros/wrap_in_quotes.sql
@@ -19,4 +19,8 @@

{%- macro postgres__wrap_in_quotes(object_to_quote) -%}
"{{ object_to_quote }}"
{%- endmacro -%}

{%- macro sqlserver__wrap_in_quotes(object_to_quote) -%}
"{{ object_to_quote }}"
{%- endmacro -%}
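
For context, a rough sketch of what the dispatched macro renders: the Postgres and SQL Server output follows directly from the macros above, while the Snowflake upper-casing is an assumption based on the README's note about casing, and the column and table names are illustrative.

```sql
-- {{ fivetran_utils.wrap_in_quotes("order_id") }} compiles roughly as follows:
--   Postgres / SQL Server:             "order_id"
--   Snowflake (assumed upper-casing):  "ORDER_ID"
select "order_id"  -- the Postgres / SQL Server rendering used in a select
from orders
```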