Title: | Connect to 'AWS Athena' using 'Boto3' ('DBI' Interface) |
---|---|
Description: | Designed to be compatible with the R package 'DBI' (Database Interface) when connecting to Amazon Web Service ('AWS') Athena <https://aws.amazon.com/athena/>. To do this 'Python' 'Boto3' Software Development Kit ('SDK') <https://boto3.amazonaws.com/v1/documentation/api/latest/index.html> is used as a driver. |
Authors: | Dyfan Jones [aut, cre] |
Maintainer: | Dyfan Jones <[email protected]> |
License: | MIT + file LICENSE |
Version: | 2.6.1 |
Built: | 2024-10-29 04:32:33 UTC |
Source: | https://github.com/DyfanJones/RAthena |
RAthena provides a seamless DBI interface into Athena using the Python package Boto3.
The goal of the RAthena package is to provide a DBI-compliant interface to Amazon's Athena using the Boto3 software development kit (SDK). This allows for an efficient, easy-to-set-up connection to Athena using the Boto3 SDK as a driver.
Before starting with RAthena, Python is required to be installed on the machine on which you intend to run RAthena.
As RAthena uses Boto3 as its backend, the AWS Command Line Interface (AWS CLI) can be used to remove user credentials when interacting with Athena.
This allows AWS profile names to be set up so that RAthena can connect to different accounts from the same machine, without needing to hard-code any credentials.
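As an illustration of such a setup (the profile names, keys and region below are placeholders, not part of RAthena), named profiles live in the shared AWS credentials and config files:

```ini
# ~/.aws/credentials
[default]
aws_access_key_id     = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

[analytics]
aws_access_key_id     = OTHER_ACCESS_KEY_ID
aws_secret_access_key = OTHER_SECRET_ACCESS_KEY

# ~/.aws/config
[profile analytics]
region = eu-west-1
```

RAthena can then target either account by passing the profile name to dbConnect (for example profile_name = "analytics"), with no credentials appearing in R code.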
Maintainer: Dyfan Jones [email protected]
Useful links:
Returns a set of temporary security credentials that you can use to access AWS resources that you might not normally have access to (link). These temporary credentials consist of an access key ID, a secret access key, and a security token. Typically, you use AssumeRole within your account or for cross-account access.
assume_role( profile_name = NULL, region_name = NULL, role_arn = NULL, role_session_name = sprintf("RAthena-session-%s", as.integer(Sys.time())), duration_seconds = 3600L, set_env = FALSE )
profile_name |
The name of a profile to use. If not given, then the default profile is used. To set profile name, the AWS Command Line Interface (AWS CLI) will need to be configured. To configure AWS CLI please refer to: Configuring the AWS CLI. |
region_name |
Default region when creating new connections. Please refer to link for AWS region codes (region code example: Region = EU (Ireland)). |
role_arn |
The Amazon Resource Name (ARN) of the role to assume (such as |
role_session_name |
An identifier for the assumed role session. By default 'RAthena' creates a session name |
duration_seconds |
The duration, in seconds, of the role session. The value can range from 900 seconds (15 minutes) up to the maximum session duration setting for the role. This setting can have a value from 1 hour to 12 hours. By default duration is set to 3600 seconds (1 hour). |
set_env |
If set to |
assume_role() returns a list containing: "AccessKeyId", "SecretAccessKey", "SessionToken" and "Expiration".
## Not run: 
# Note: 
# - Require AWS Account to run below example.
library(RAthena)
library(DBI)

# Assuming demo ARN role
assume_role(
  profile_name = "YOUR_PROFILE_NAME",
  role_arn = "arn:aws:sts::123456789012:assumed-role/role_name/role_session_name",
  set_env = TRUE
)

# Connect to Athena using ARN Role
con <- dbConnect(RAthena::athena())
## End(Not run)
Driver for an Athena Boto3 connection.
athena()
athena()
returns an S4 class. This class is used to activate the Athena method for dbConnect.
RAthena::athena()
Convenience functions for reading/writing DBMS tables
## S4 method for signature 'AthenaConnection,character,data.frame'
dbWriteTable(
  conn, name, value, overwrite = FALSE, append = FALSE,
  row.names = NA, field.types = NULL, partition = NULL,
  s3.location = NULL, file.type = c("tsv", "csv", "parquet", "json"),
  compress = FALSE, max.batch = Inf, ...
)

## S4 method for signature 'AthenaConnection,Id,data.frame'
dbWriteTable(
  conn, name, value, overwrite = FALSE, append = FALSE,
  row.names = NA, field.types = NULL, partition = NULL,
  s3.location = NULL, file.type = c("tsv", "csv", "parquet", "json"),
  compress = FALSE, max.batch = Inf, ...
)

## S4 method for signature 'AthenaConnection,SQL,data.frame'
dbWriteTable(
  conn, name, value, overwrite = FALSE, append = FALSE,
  row.names = NA, field.types = NULL, partition = NULL,
  s3.location = NULL, file.type = c("tsv", "csv", "parquet", "json"),
  compress = FALSE, max.batch = Inf, ...
)
conn |
An |
name |
A character string specifying a table name. Names will be automatically quoted so you can use any sequence of characters, not just any valid bare table name. |
value |
A data.frame to write to the database. |
overwrite |
Allows overwriting the destination table. Cannot be |
append |
Allow appending to the destination table. Cannot be
|
row.names |
Either TRUE, FALSE, NA or a string. If TRUE, always translate row names to a column called "row_names". If FALSE, never translate row names. If NA, translate row names only if they are a character vector. A string is equivalent to TRUE, but allows you to override the default name. For backward compatibility, NULL is equivalent to FALSE. |
field.types |
Additional field types used to override derived types. |
partition |
Partition Athena table (needs to be a named list or vector) for example: |
s3.location |
s3 bucket to store Athena table, must be set as a s3 uri for example ("s3://mybucket/data/").
By default, the s3.location is set to s3 staging directory from |
file.type |
What file type to store data.frame on s3, RAthena currently supports ["tsv", "csv", "parquet", "json"]. Default delimited file type is "tsv", in previous versions
of |
compress |
|
max.batch |
Split the data frame by max number of rows i.e. 100,000 so that multiple files can be uploaded into AWS S3. By default when compression
is set to |
... |
Other arguments used by individual methods. |
dbWriteTable() returns TRUE, invisibly. If the table exists, and both append and overwrite arguments are unset, or append = TRUE and the data frame with the new data has different column names, an error is raised; the remote table remains unchanged.
## Not run: 
# Note: 
# - Require AWS Account to run below example.
# - Different connection methods can be used please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# List existing tables in Athena
dbListTables(con)

# Write data.frame to Athena table
dbWriteTable(con, "mtcars", mtcars,
  partition = c("TIMESTAMP" = format(Sys.Date(), "%Y%m%d")),
  s3.location = "s3://mybucket/data/"
)

# Read entire table from Athena
dbReadTable(con, "mtcars")

# List all tables in Athena after uploading new table to Athena
dbListTables(con)

# Checking if uploaded table exists in Athena
dbExistsTable(con, "mtcars")

# using default s3.location
dbWriteTable(con, "iris", iris)

# Read entire table from Athena
dbReadTable(con, "iris")

# List all tables in Athena after uploading new table to Athena
dbListTables(con)

# Checking if uploaded table exists in Athena
dbExistsTable(con, "iris")

# Disconnect from Athena
dbDisconnect(con)
## End(Not run)
These functions are used to build the different types of SQL queries. The AWS Athena implementation gives extra parameters to allow access to the standard DBI Athena methods. They also utilise AWS Glue to speed up SQL query execution.
db_explain.AthenaConnection(con, sql, ...)

db_query_fields.AthenaConnection(con, sql, ...)
con |
A |
sql |
SQL code to be sent to AWS Athena |
... |
other parameters, currently not implemented |
Returns the AWS Athena explain statement
Returns the SQL query column names
These functions are used to build the different types of SQL queries. The AWS Athena implementation gives extra parameters to allow access to the standard DBI Athena methods. They also utilise AWS Glue to speed up SQL query execution.
sql_query_explain.AthenaConnection(con, sql, format = "text", type = NULL, ...)

sql_query_fields.AthenaConnection(con, sql, ...)

sql_escape_date.AthenaConnection(con, x)

sql_escape_datetime.AthenaConnection(con, x)
con |
A |
sql |
SQL code to be sent to AWS Athena |
format |
returning format for explain queries, default set to '"text"'. Other formats can be found: https://docs.aws.amazon.com/athena/latest/ug/athena-explain-statement.html |
type |
return plan for explain queries, default set to 'NULL'. Other type can be found: https://docs.aws.amazon.com/athena/latest/ug/athena-explain-statement.html |
... |
other parameters, currently not implemented |
x |
R object to be transformed into athena equivalent |
Returns the SQL query for the AWS Athena explain statement
Returns the SQL query column names
Returns SQL escaping for dates
Returns SQL escaping for date-times
db_compute for Athena. This is a backend function for dplyr's compute function. Users won't be required to access and run this function.
db_compute.AthenaConnection(con, table, sql, ...)
con |
A |
table |
Table name, if left default RAthena will use the default from |
sql |
SQL code to be sent to the data |
... |
passes
|
db_compute returns the table name.
AthenaWriteTables
backend_dbplyr_v2
backend_dbplyr_v1
## Not run: 
# Note: 
# - Require AWS Account to run below example.
# - Different connection methods can be used please see `RAthena::dbConnect` documentation
library(DBI)
library(dplyr)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Write data.frame to Athena table
copy_to(con, mtcars, s3_location = "s3://mybucket/data/")

# Write Athena table from tbl_sql
athena_mtcars <- tbl(con, "mtcars")
mtcars_filter <- athena_mtcars %>% filter(gear >= 4)

# create Athena table with unique table name
mtcars_filter %>% compute()

# create Athena table with specified name and s3 location
mtcars_filter %>% compute("mtcars_filter", s3_location = "s3://mybucket/mtcars_filter/")

# Disconnect from Athena
dbDisconnect(con)
## End(Not run)
db_connection_describe for Athena (api version 2). This is a backend function for dplyr to retrieve metadata about Athena queries. Users won't be required to access and run this function.
db_connection_describe.AthenaConnection(con)
con |
A |
Character variable containing metadata about the query sent to Athena. The metadata is returned in the following format:
"Athena <boto3 version> [<profile_name>@region/database]"
db_copy_to for Athena. This is an Athena method for the dbplyr function db_copy_to, used to create an Athena table from a data.frame.
db_copy_to.AthenaConnection(
  con, table, values, overwrite = FALSE, append = FALSE,
  types = NULL, partition = NULL, s3_location = NULL,
  file_type = c("csv", "tsv", "parquet"), compress = FALSE,
  max_batch = Inf, ...
)
con |
A |
table |
A character string specifying a table name. Names will be automatically quoted so you can use any sequence of characters, not just any valid bare table name. |
values |
A data.frame to write to the database. |
overwrite |
Allows overwriting the destination table. Cannot be |
append |
Allow appending to the destination table. Cannot be |
types |
Additional field types used to override derived types. |
partition |
Partition Athena table (needs to be a named list or vector) for example: |
s3_location |
s3 bucket to store Athena table, must be set as a s3 uri for example ("s3://mybucket/data/") |
file_type |
What file type to store data.frame on s3, RAthena currently supports ["tsv", "csv", "parquet"]. Default delimited file type is "tsv", in previous versions
of |
compress |
|
max_batch |
Split the data frame by max number of rows i.e. 100,000 so that multiple files can be uploaded into AWS S3. By default when compression
is set to |
... |
other parameters currently not supported in RAthena |
db_copy_to returns the table name.
## Not run: 
# Note: 
# - Require AWS Account to run below example.
# - Different connection methods can be used please see `RAthena::dbConnect` documentation
library(DBI)
library(dplyr)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# List existing tables in Athena
dbListTables(con)

# Write data.frame to Athena table
copy_to(con, mtcars, s3_location = "s3://mybucket/data/")

# Checking if uploaded table exists in Athena
dbExistsTable(con, "mtcars")

# Write Athena table from tbl_sql
athena_mtcars <- tbl(con, "mtcars")
mtcars_filter <- athena_mtcars %>% filter(gear >= 4)
copy_to(con, mtcars_filter)

# Checking if uploaded table exists in Athena
dbExistsTable(con, "mtcars_filter")

# Disconnect from Athena
dbDisconnect(con)
## End(Not run)
db_desc for Athena (api version 1). This is a backend function for dplyr to retrieve metadata about Athena queries. Users won't be required to access and run this function.
db_desc.AthenaConnection(x)
x |
A |
Character variable containing metadata about the query sent to Athena. The metadata is returned in the following format:
"Athena <boto3 version> [<profile_name>@region/database]"
Frees all resources (local and Athena) associated with the result set. It does this by removing the query output in the AWS S3 bucket, stopping the query execution if it is still running, and removing the connection resource locally.
## S4 method for signature 'AthenaResult' dbClearResult(res, ...)
res |
An object inheriting from DBIResult. |
... |
Other arguments passed on to methods. |
dbClearResult() returns TRUE, invisibly.
If the user does not have permission to remove the AWS S3 resource from the AWS Athena output location, then an AWS warning will be returned.
It is better to use query caching, or optionally to prevent clearing of the AWS S3 resource using RAthena_options, so that the warning doesn't repeatedly show.
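A minimal sketch of that workaround follows; the RAthena_options() argument names here (clear_s3_resource, cache_size) reflect our understanding of the package and should be checked against the installed version's documentation:

```r
## Not run: 
# Requires an AWS account and the RAthena package.
library(RAthena)

# Keep query output in S3 instead of deleting it on dbClearResult(),
# avoiding the repeated permission warning (argument name assumed):
RAthena_options(clear_s3_resource = FALSE)

# Alternatively, enable query caching so repeated queries reuse results:
RAthena_options(cache_size = 10)
## End(Not run)
```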
## Not run: 
# Note: 
# - Require AWS Account to run below example.
# - Different connection methods can be used please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

res <- dbSendQuery(con, "show databases")
dbClearResult(res)

# Disconnect from Athena
dbDisconnect(con)
## End(Not run)
Produces a data.frame that describes the output of a query.
## S4 method for signature 'AthenaResult' dbColumnInfo(res, ...)
res |
An object inheriting from DBIResult. |
... |
Other arguments passed on to methods. |
dbColumnInfo() returns a data.frame with as many rows as there are output fields in the result. The data.frame has two columns (field_name, type).
## Not run: 
# Note: 
# - Require AWS Account to run below example.
# - Different connection methods can be used please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Get Column information from query
res <- dbSendQuery(con, "select * from information_schema.tables")
dbColumnInfo(res)
dbClearResult(res)

# Disconnect from Athena
dbDisconnect(con)
## End(Not run)
It is never advised to hard-code credentials when making a connection to Athena (even though the option is there). Instead it is advised to use profile_name (set up by the AWS Command Line Interface), Amazon Resource Name roles or environment variables. Here is a list of supported environment variables:
AWS_ACCESS_KEY_ID: is equivalent to the dbConnect parameter - aws_access_key_id
AWS_SECRET_ACCESS_KEY: is equivalent to the dbConnect parameter - aws_secret_access_key
AWS_SESSION_TOKEN: is equivalent to the dbConnect parameter - aws_session_token
AWS_EXPIRATION: is equivalent to the dbConnect parameter - duration_seconds
AWS_ATHENA_S3_STAGING_DIR: is equivalent to the dbConnect parameter - s3_staging_dir
AWS_ATHENA_WORK_GROUP: is equivalent to the dbConnect parameter - work_group
AWS_REGION: is equivalent to the dbConnect parameter - region_name
NOTE: If you have set any environment variables in .Renviron, please restart your R session in order for the changes to take effect.
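As an illustration of the environment-variable route (the staging directory and region values are placeholders; setting the same variables in .Renviron has the same effect after a restart):

```r
# Set Athena connection details via environment variables (placeholder values)
Sys.setenv(
  AWS_ATHENA_S3_STAGING_DIR = "s3://path/to/query/bucket/",
  AWS_REGION = "eu-west-2"
)

## Not run: 
# dbConnect() now needs no explicit s3_staging_dir or region_name;
# credentials are still resolved from the AWS CLI default profile.
library(DBI)
con <- dbConnect(RAthena::athena())
dbDisconnect(con)
## End(Not run)
```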
## S4 method for signature 'AthenaDriver'
dbConnect(
  drv,
  aws_access_key_id = NULL,
  aws_secret_access_key = NULL,
  aws_session_token = NULL,
  schema_name = "default",
  work_group = NULL,
  poll_interval = NULL,
  encryption_option = c("NULL", "SSE_S3", "SSE_KMS", "CSE_KMS"),
  kms_key = NULL,
  profile_name = NULL,
  role_arn = NULL,
  role_session_name = sprintf("RAthena-session-%s", as.integer(Sys.time())),
  duration_seconds = 3600L,
  s3_staging_dir = NULL,
  region_name = NULL,
  botocore_session = NULL,
  bigint = c("integer64", "integer", "numeric", "character"),
  binary = c("raw", "character"),
  json = c("auto", "character"),
  timezone = "UTC",
  keyboard_interrupt = TRUE,
  rstudio_conn_tab = TRUE,
  endpoint_override = NULL,
  ...
)
drv |
an object that inherits from DBIDriver, or an existing DBIConnection object (in order to clone an existing connection). |
aws_access_key_id |
AWS access key ID |
aws_secret_access_key |
AWS secret access key |
aws_session_token |
AWS temporary session token |
schema_name |
The schema_name to which the connection belongs |
work_group |
The name of the work group in which to run Athena queries. Currently defaulted to |
poll_interval |
Amount of time to wait when checking the query execution status. Default set to a random interval between 0.5 - 1 seconds. |
encryption_option |
Athena encryption at rest link. Supported Amazon S3 Encryption Options ["NULL", "SSE_S3", "SSE_KMS", "CSE_KMS"]. Connection will default to NULL, usually changing this option is not required. |
kms_key |
AWS Key Management Service, please refer to link for more information around the concept. |
profile_name |
The name of a profile to use. If not given, then the default profile is used. To set profile name, the AWS Command Line Interface (AWS CLI) will need to be configured. To configure AWS CLI please refer to: Configuring the AWS CLI. |
role_arn |
The Amazon Resource Name (ARN) of the role to assume (such as |
role_session_name |
An identifier for the assumed role session. By default 'RAthena' creates a session name |
duration_seconds |
The duration, in seconds, of the role session. The value can range from 900 seconds (15 minutes) up to the maximum session duration setting for the role. This setting can have a value from 1 hour to 12 hours. By default duration is set to 3600 seconds (1 hour). |
s3_staging_dir |
The location in Amazon S3 where your query results are stored, such as |
region_name |
Default region when creating new connections. Please refer to link for AWS region codes (region code example: Region = EU (Ireland)). |
botocore_session |
Use this Botocore session instead of creating a new default one. |
bigint |
The R type that 64-bit integer types should be mapped to, default is [bit64::integer64], which allows the full range of 64 bit integers. |
binary |
The R type that [binary/varbinary] types should be mapped to, default is [raw]. If the mapping fails R will resort to [character] type. To ignore data type conversion set to ["character"]. |
json |
Attempt to converts AWS Athena data types [arrays, json] using |
timezone |
Sets the timezone for the connection. The default is 'UTC'. If 'NULL' then no timezone is set, which defaults to the server's time zone. 'AWS Athena' accepted time zones: https://docs.aws.amazon.com/athena/latest/ug/athena-supported-time-zones.html. |
keyboard_interrupt |
Stops AWS Athena process when R gets a keyboard interrupt, currently defaults to |
rstudio_conn_tab |
Optional to get AWS Athena Schema from AWS Glue Catalogue and display it in RStudio's Connections Tab.
Default set to |
endpoint_override |
(character/list) The complete URL to use for the constructed client. Normally,
|
... |
Passes parameters to
|
dbConnect() returns an S4 class. This object is used to communicate with AWS Athena.
## Not run: 
# Connect to Athena using your aws access keys
library(DBI)
con <- dbConnect(RAthena::athena(),
  aws_access_key_id = 'YOUR_ACCESS_KEY_ID',
  aws_secret_access_key = 'YOUR_SECRET_ACCESS_KEY',
  s3_staging_dir = 's3://path/to/query/bucket/',
  region_name = 'us-west-2'
)
dbDisconnect(con)

# Connect to Athena using your profile name
# Profile name can be created by using AWS CLI
con <- dbConnect(RAthena::athena(),
  profile_name = "YOUR_PROFILE_NAME",
  s3_staging_dir = 's3://path/to/query/bucket/'
)
dbDisconnect(con)

# Connect to Athena using ARN role
con <- dbConnect(RAthena::athena(),
  profile_name = "YOUR_PROFILE_NAME",
  role_arn = "arn:aws:sts::123456789012:assumed-role/role_name/role_session_name",
  s3_staging_dir = 's3://path/to/query/bucket/'
)
dbDisconnect(con)
## End(Not run)
Utilises AWS Athena to convert AWS S3 backend file types. It also allows the creation of more efficient file types i.e. "parquet" and "orc" from SQL queries.
dbConvertTable(conn, obj, name, ...)

## S4 method for signature 'AthenaConnection'
dbConvertTable(
  conn, obj, name, partition = NULL, s3.location = NULL,
  file.type = c("NULL", "csv", "tsv", "parquet", "json", "orc"),
  compress = TRUE, data = TRUE, ...
)
conn |
An |
obj |
Athena table or |
name |
Name of destination table |
... |
Extra parameters, currently not used |
partition |
Partition Athena table |
s3.location |
location to store output file, must be in s3 uri format for example ("s3://mybucket/data/"). |
file.type |
File type for |
compress |
Compress |
data |
If |
dbConvertTable() returns TRUE, invisibly.
## Not run: 
# Note: 
# - Require AWS Account to run below example.
# - Different connection methods can be used please see `RAthena::dbConnect` documentation
library(DBI)
library(RAthena)

# Demo connection to Athena using profile name
con <- dbConnect(athena())

# write iris table to Athena in default delimited format
dbWriteTable(con, "iris", iris)

# convert delimited table to parquet
dbConvertTable(con,
  obj = "iris",
  name = "iris_parquet",
  file.type = "parquet"
)

# Create partitioned table from non-partitioned
# iris table using SQL DML query
dbConvertTable(con,
  obj = SQL("select iris.*, date_format(current_date, '%Y%m%d') as time_stamp from iris"),
  name = "iris_orc_partitioned",
  file.type = "orc",
  partition = "time_stamp"
)

# disconnect from Athena
dbDisconnect(con)
## End(Not run)
Returns a character string that describes the Athena SQL data type for the obj
object.
## S4 method for signature 'AthenaDriver,ANY'
dbDataType(dbObj, obj, ...)

## S4 method for signature 'AthenaDriver,list'
dbDataType(dbObj, obj, ...)

## S4 method for signature 'AthenaConnection,ANY'
dbDataType(dbObj, obj, ...)

## S4 method for signature 'AthenaConnection,data.frame'
dbDataType(dbObj, obj, ...)
dbObj |
A object inheriting from DBIDriver or DBIConnection |
obj |
An R object whose SQL type we want to determine. |
... |
Other arguments passed on to methods. |
dbDataType returns the Athena type that corresponds to the obj argument as a non-empty character string.
library(RAthena)

dbDataType(athena(), 1:5)
dbDataType(athena(), 1)
dbDataType(athena(), TRUE)
dbDataType(athena(), Sys.Date())
dbDataType(athena(), Sys.time())
dbDataType(athena(), c("x", "abc"))
dbDataType(athena(), list(raw(10), raw(20)))

vapply(iris, function(x) dbDataType(RAthena::athena(), x),
  FUN.VALUE = character(1), USE.NAMES = TRUE
)

## Not run: 
# Note: 
# - Require AWS Account to run below example.
# - Different connection methods can be used please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Sending Queries to Athena
dbDataType(con, iris)

# Disconnect connection
dbDisconnect(con)
## End(Not run)
This closes the connection to Athena.
## S4 method for signature 'AthenaConnection'
dbDisconnect(conn, ...)
conn |
A DBIConnection object, as returned by
|
... |
Other parameters passed on to methods. |
dbDisconnect()
returns TRUE
, invisibly.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Disconnect connection
dbDisconnect(con)

## End(Not run)
Returns a logical scalar indicating whether the table exists. TRUE
if the table exists, FALSE
otherwise.
## S4 method for signature 'AthenaConnection,character'
dbExistsTable(conn, name, ...)
conn |
A DBIConnection object, as returned by
|
name |
The table name, passed on to
|
... |
Other parameters passed on to methods. |
dbExistsTable()
returns logical scalar. TRUE
if the table exists, FALSE
otherwise.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Write data.frame to Athena table
dbWriteTable(con, "mtcars", mtcars,
  partition = c("TIMESTAMP" = format(Sys.Date(), "%Y%m%d")),
  s3.location = "s3://mybucket/data/"
)

# Check if table exists in Athena
dbExistsTable(con, "mtcars")

# Disconnect connection
dbDisconnect(con)

## End(Not run)
Currently returns the top n elements (rows) from the result set, or the entire table from Athena.
## S4 method for signature 'AthenaResult'
dbFetch(res, n = -1, ...)
res |
An object inheriting from DBIResult, created by
|
n |
maximum number of records to retrieve per fetch. Use |
... |
Other arguments passed on to methods. |
dbFetch()
returns a data frame.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

res <- dbSendQuery(con, "show databases")
dbFetch(res)
dbClearResult(res)

# Disconnect from Athena
dbDisconnect(con)

## End(Not run)
Get DBMS metadata
## S4 method for signature 'AthenaConnection'
dbGetInfo(dbObj, ...)

## S4 method for signature 'AthenaResult'
dbGetInfo(dbObj, ...)
dbObj |
An object inheriting from DBIObject, i.e. DBIDriver, DBIConnection, or a DBIResult |
... |
Other arguments to methods. |
a named list
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Return metadata from connection object
metadata <- dbGetInfo(con)

# Return metadata from Athena query object
res <- dbSendQuery(con, "show databases")
dbGetInfo(res)

# Clear result
dbClearResult(res)

# Disconnect from Athena
dbDisconnect(con)

## End(Not run)
This method returns all partitions from an Athena table.
dbGetPartition(conn, name, ..., .format = FALSE)

## S4 method for signature 'AthenaConnection'
dbGetPartition(conn, name, ..., .format = FALSE)
conn |
A DBIConnection object, as returned by
|
name |
The table name, passed on to
|
... |
Other parameters passed on to methods. |
.format |
re-formats AWS Athena partitions format. So that each column represents a partition
from the AWS Athena table. Default set to |
A data.frame containing all partitions in the table. If the Athena table has no partitions, the function will return an error from Athena.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Write iris table to Athena
dbWriteTable(con, "iris", iris,
  partition = c("timestamp" = format(Sys.Date(), "%Y%m%d")),
  s3.location = "s3://path/to/store/athena/table/"
)

# Return table partitions
RAthena::dbGetPartition(con, "iris")

# Disconnect from Athena
dbDisconnect(con)

## End(Not run)
Send query, retrieve results and then clear result set
## S4 method for signature 'AthenaConnection,character'
dbGetQuery(conn, statement, statistics = FALSE, unload = athena_unload(), ...)
conn |
A DBIConnection object, as returned by
|
statement |
a character string containing SQL. |
statistics |
If set to |
unload |
boolean input to modify 'statement' to align with AWS Athena UNLOAD,
default is set to |
... |
Other parameters passed on to methods. |
dbGetQuery()
returns a data.frame.
If the user does not have permission to remove the AWS S3 resource from the AWS Athena output location, then an AWS warning will be returned.
For example AccessDenied (HTTP 403). Access Denied
.
It is better to use query caching, or to optionally prevent clearing of the AWS S3 resource, using RAthena_options
so that the warning doesn't repeatedly show.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Sending queries to Athena
dbGetQuery(con, "show databases")

# Disconnect connection
dbDisconnect(con)

## End(Not run)
Returns the statement that was passed to [dbSendQuery()] or [dbSendStatement()].
## S4 method for signature 'AthenaResult'
dbGetStatement(res, ...)
res |
An object inheriting from DBIResult. |
... |
Other arguments passed on to methods. |
dbGetStatement()
returns a character.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

rs <- dbSendQuery(con, "SHOW TABLES in default")
dbGetStatement(rs)

## End(Not run)
Method to get Athena schemas, tables and table types, returned as a data.frame.
dbGetTables(conn, ...)

## S4 method for signature 'AthenaConnection'
dbGetTables(conn, schema = NULL, ...)
conn |
A DBIConnection object, as returned by
|
... |
Other parameters passed on to methods. |
schema |
Athena schema, default set to NULL to return all tables from all Athena schemas. Note: The use of DATABASE and SCHEMA is interchangeable within Athena. |
dbGetTables()
returns a data.frame.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)
library(RAthena)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Return hierarchy of tables in Athena
dbGetTables(con)

# Disconnect connection
dbDisconnect(con)

## End(Not run)
This method returns whether the query has completed.
## S4 method for signature 'AthenaResult'
dbHasCompleted(res, ...)
res |
An object inheriting from DBIResult. |
... |
Other arguments passed on to methods. |
dbHasCompleted()
returns a logical scalar. TRUE
if the query has completed, FALSE
otherwise.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Check if query has completed
res <- dbSendQuery(con, "show databases")
dbHasCompleted(res)
dbClearResult(res)

# Disconnect from Athena
dbDisconnect(con)

## End(Not run)
This method tests whether the dbObj
is still valid.
## S4 method for signature 'AthenaConnection'
dbIsValid(dbObj, ...)

## S4 method for signature 'AthenaResult'
dbIsValid(dbObj, ...)
dbObj |
An object inheriting from DBIObject, i.e. DBIDriver, DBIConnection, or a DBIResult |
... |
Other arguments to methods. |
dbIsValid()
returns logical scalar, TRUE
if the object (dbObj
) is valid, FALSE
otherwise.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Check if connection is valid
dbIsValid(con)

# Check if query is valid
res <- dbSendQuery(con, "show databases")
dbIsValid(res)

# Check if query is valid after clearing result
dbClearResult(res)
dbIsValid(res)

# Check if connection is valid after closing connection
dbDisconnect(con)
dbIsValid(con)

## End(Not run)
List field names of an Athena table.
## S4 method for signature 'AthenaConnection,character'
dbListFields(conn, name, ...)
conn |
A DBIConnection object, as returned by
|
name |
The table name, passed on to
|
... |
Other parameters passed on to methods. |
dbListFields()
returns a character vector with all the fields from an Athena table.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Write data.frame to Athena table
dbWriteTable(con, "mtcars", mtcars,
  partition = c("TIMESTAMP" = format(Sys.Date(), "%Y%m%d")),
  s3.location = "s3://mybucket/data/"
)

# Return list of fields in table
dbListFields(con, "mtcars")

# Disconnect connection
dbDisconnect(con)

## End(Not run)
Returns the unquoted names of Athena tables accessible through this connection.
## S4 method for signature 'AthenaConnection'
dbListTables(conn, schema = NULL, ...)
conn |
A DBIConnection object, as returned by
|
schema |
Athena schema, default set to NULL to return all tables from all Athena schemas. Note: The use of DATABASE and SCHEMA is interchangeable within Athena. |
... |
Other parameters passed on to methods. |
dbListTables()
returns a character vector with all the tables from Athena.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Return list of tables in Athena
dbListTables(con)

# Disconnect connection
dbDisconnect(con)

## End(Not run)
Declare which version of dbplyr API is being called.
dbplyr_edition.AthenaConnection(con)
con |
A |
Integer for which version of 'dbplyr' is going to be used.
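As a sketch (assuming an AWS account and that the dbplyr package is installed; dbplyr calls this generic internally when generating SQL), the declared edition can be inspected directly:

```r
## Not run:
library(DBI)
library(dbplyr)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Returns the dbplyr API edition this backend implements
dbplyr::dbplyr_edition(con)

dbDisconnect(con)
## End(Not run)
```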
Call this method to generate a string that is suitable for use in a query as a column or table name.
## S4 method for signature 'AthenaConnection,character'
dbQuoteString(conn, x, ...)

## S4 method for signature 'AthenaConnection,POSIXct'
dbQuoteString(conn, x, ...)

## S4 method for signature 'AthenaConnection,Date'
dbQuoteString(conn, x, ...)

## S4 method for signature 'AthenaConnection,SQL'
dbQuoteIdentifier(conn, x, ...)
conn |
A DBIConnection object, as returned by
|
x |
A character vector to quote as string. |
... |
Other arguments passed on to methods. |
Returns a character object. For more information please check out: dbQuoteString
, dbQuoteIdentifier
dbQuoteString
, dbQuoteIdentifier
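A minimal sketch (assuming an AWS account for the connection object) contrasting the two quoting helpers — string literals versus identifiers:

```r
## Not run:
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# String literals are quoted for use in a SQL statement
dbQuoteString(con, "It's a string")

# Identifiers (column/table names) are quoted separately
dbQuoteIdentifier(con, "my_table")

dbDisconnect(con)
## End(Not run)
```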
Removes Athena table but does not remove the data from Amazon S3 bucket.
## S4 method for signature 'AthenaConnection,character'
dbRemoveTable(conn, name, delete_data = TRUE, confirm = FALSE, ...)
conn |
A DBIConnection object, as returned by
|
name |
The table name, passed on to
|
delete_data |
Deletes S3 files linking to AWS Athena table |
confirm |
Allows for S3 files to be deleted without the prompt check. It is recommend to leave this set to |
... |
Other parameters passed on to methods. |
dbRemoveTable()
returns TRUE
, invisibly.
If you are having difficulty removing AWS S3 files, please check that the AWS S3 location follows AWS best practices: Table Location in Amazon S3
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Write data.frame to Athena table
dbWriteTable(con, "mtcars", mtcars,
  partition = c("TIMESTAMP" = format(Sys.Date(), "%Y%m%d")),
  s3.location = "s3://mybucket/data/"
)

# Remove table from Athena
dbRemoveTable(con, "mtcars")

# Disconnect connection
dbDisconnect(con)

## End(Not run)
Executes a statement to return the data definition language (DDL) of the Athena table.
dbShow(conn, name, ...)

## S4 method for signature 'AthenaConnection'
dbShow(conn, name, ...)
conn |
A DBIConnection object, as returned by
|
name |
The table name, passed on to
|
... |
Other parameters passed on to methods. |
dbShow()
returns SQL
characters of the Athena table DDL.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Write iris table to Athena
dbWriteTable(con, "iris", iris,
  partition = c("timestamp" = format(Sys.Date(), "%Y%m%d")),
  s3.location = "s3://path/to/store/athena/table/"
)

# Return table DDL
RAthena::dbShow(con, "iris")

# Disconnect from Athena
dbDisconnect(con)

## End(Not run)
Returns AWS Athena statistics from queries executed via dbSendQuery
dbStatistics(res, ...)

## S4 method for signature 'AthenaResult'
dbStatistics(res, ...)
res |
An object inheriting from DBIResult. |
... |
Other arguments passed on to methods. |
dbStatistics()
returns a list containing Athena statistics returned from boto3
.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)
library(RAthena)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

res <- dbSendQuery(con, "show databases")
dbStatistics(res)

# Clean up
dbClearResult(res)

## End(Not run)
Install Amazon SDK boto3 for Athena connection
install_boto(
  method = c("auto", "virtualenv", "conda"),
  conda = "auto",
  envname = "RAthena",
  conda_python_version = "3.7",
  ...
)
method |
Installation method. By default, "auto" automatically finds a method that will work in the local environment. Change the default to force a specific installation method. Note that the "virtualenv" method is not available on Windows. Note also that since this command runs without privilege the "system" method is available only on Windows. |
conda |
The path to a |
envname |
Name of Python environment to install within, by default environment name RAthena. |
conda_python_version |
the python version installed in the created conda environment. Python 3.7 is installed by default. |
... |
other arguments passed to [reticulate::conda_install()] or [reticulate::virtualenv_install()]. |
Returns NULL
after installing Python
Boto3
.
[reticulate::use_python] or [reticulate::use_condaenv] might be required before connecting to Athena.
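A minimal installation sketch, following the defaults above (the "RAthena" environment name comes from the envname default; whether use_condaenv is needed depends on your reticulate setup):

```r
## Not run:
library(RAthena)

# Install boto3 into the default "RAthena" Python environment
install_boto()

# If reticulate does not discover the environment automatically,
# point it at the environment before connecting to Athena
reticulate::use_condaenv("RAthena", required = TRUE)
## End(Not run)
```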
The dbSendQuery()
and dbSendStatement()
methods submit a query to Athena but do not wait for the query to execute.
dbHasCompleted
method will need to be run to check whether the query has completed.
The dbExecute()
method submits a query to Athena and waits for the query to be executed.
## S4 method for signature 'AthenaConnection,character'
dbSendQuery(conn, statement, unload = athena_unload(), ...)

## S4 method for signature 'AthenaConnection,character'
dbSendStatement(conn, statement, unload = athena_unload(), ...)

## S4 method for signature 'AthenaConnection,character'
dbExecute(conn, statement, unload = athena_unload(), ...)
conn |
A DBIConnection object, as returned by
|
statement |
a character string containing SQL. |
unload |
boolean input to modify 'statement' to align with AWS Athena UNLOAD,
default is set to |
... |
Other parameters passed on to methods. |
Returns AthenaResult
S4 class.
dbSendQuery
, dbSendStatement
, dbExecute
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used, please see `RAthena::dbConnect` documentation
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Sending queries to Athena
res1 <- dbSendQuery(con, "show databases")
res2 <- dbSendStatement(con, "show databases")
res3 <- dbExecute(con, "show databases")

# Disconnect connection
dbDisconnect(con)

## End(Not run)
RAthena_options()
provides a method to change the backend. This includes changing the file parser,
whether RAthena
should cache query IDs locally, and the number of retries on a failed API call.
RAthena_options(
  file_parser,
  bigint,
  binary,
  json,
  cache_size,
  clear_cache,
  retry,
  retry_quiet,
  unload,
  clear_s3_resource,
  verbose
)
file_parser |
Method to read and write tables to Athena, currently default to |
bigint |
The R type that 64-bit integer types should be mapped to (default: |
binary |
The R type that [binary/varbinary] types should be mapped to (default |
json |
Attempt to converts AWS Athena data types [arrays, json] using |
cache_size |
Number of queries to be cached. Currently only support caching up to 100 distinct queries (default: |
clear_cache |
Clears all previous cached query metadata |
retry |
Maximum number of requests to attempt (default: |
retry_quiet |
This method is deprecated please use verbose instead. |
unload |
set AWS Athena unload functionality globally (default: |
clear_s3_resource |
Clear down 'AWS Athena' 'AWS S3' resource ( |
verbose |
print package info messages (default: |
RAthena_options()
returns NULL
, invisibly.
library(RAthena)

# change file parser from default data.table to vroom
RAthena_options("vroom")

# cache queries locally
RAthena_options(cache_size = 5)
Returns a set of temporary credentials for an AWS account or IAM user (link).
get_session_token(
  profile_name = NULL,
  region_name = NULL,
  serial_number = NULL,
  token_code = NULL,
  duration_seconds = 3600L,
  set_env = FALSE
)
profile_name |
The name of a profile to use. If not given, then the default profile is used. To set profile name, the AWS Command Line Interface (AWS CLI) will need to be configured. To configure AWS CLI please refer to: Configuring the AWS CLI. |
region_name |
Default region when creating new connections. Please refer to link for
AWS region codes (region code example: Region = EU (Ireland) |
serial_number |
The identification number of the MFA device that is associated with the IAM user who is making the GetSessionToken call. Specify this value if the IAM user has a policy that requires MFA authentication. The value is either the serial number for a hardware device (such as 'GAHT12345678') or an Amazon Resource Name (ARN) for a virtual device (such as arn:aws:iam::123456789012:mfa/user). |
token_code |
The value provided by the MFA device, if MFA is required. If any policy requires the IAM user to submit an MFA code, specify this value. If MFA authentication is required, the user must provide a code when requesting a set of temporary security credentials. A user who fails to provide the code receives an "access denied" response when requesting resources that require MFA authentication. |
duration_seconds |
The duration, in seconds, that the credentials should remain valid. Acceptable duration for IAM user sessions range from 900 seconds (15 minutes) to 129,600 seconds (36 hours), with 3,600 seconds (1 hour) as the default. |
set_env |
If set to |
get_session_token()
returns a list containing: "AccessKeyId"
, "SecretAccessKey"
, "SessionToken"
and "Expiration"
## Not run:
# Note:
# - Requires an AWS account to run the below example.
library(RAthena)
library(DBI)

# Create temporary credentials with a duration of 1 hour
get_session_token("YOUR_PROFILE_NAME",
  serial_number = "arn:aws:iam::123456789012:mfa/user",
  token_code = "531602",
  set_env = TRUE
)

# Connect to Athena using temporary credentials
con <- dbConnect(athena())

## End(Not run)
Creates an S3 implementation of sql_translate_env
for the AWS Athena SQL translation environment, based off
Athena Data Types and
DML Queries, Functions, and Operators
sql_translation.AthenaConnection(con)

sql_translate_env.AthenaConnection(con)

sql_escape_string.AthenaConnection(con, x)
con |
An |
x |
An object to escape. Existing sql vectors will be left as is, character vectors are escaped with single quotes, numeric vectors have a trailing '.0' added if they're whole numbers, identifiers are escaped with double quotes. |
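A sketch of the translation environment at work (assumes an AWS account, the dplyr/dbplyr packages, and a hypothetical iris table already written to Athena with lower-case column names such as petal_length):

```r
## Not run:
library(DBI)
library(dplyr)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# dbplyr uses the Athena translation environment when rendering SQL;
# show_query() displays the generated statement without executing it
tbl(con, "iris") %>%
  filter(petal_length > 2) %>%
  show_query()

dbDisconnect(con)
## End(Not run)
```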
Creates an interface to compose CREATE EXTERNAL TABLE
.
## S4 method for signature 'AthenaConnection'
sqlCreateTable(
  con,
  table,
  fields,
  field.types = NULL,
  partition = NULL,
  s3.location = NULL,
  file.type = c("tsv", "csv", "parquet", "json"),
  compress = FALSE,
  ...
)
con |
A database connection. |
table |
The table name, passed on to
|
fields |
Either a character vector or a data frame. A named character vector: Names are column names, values are types.
Names are escaped with A data frame: field types are generated using
|
field.types |
Additional field types used to override derived types. |
partition |
Partition Athena table (needs to be a named list or vector) for example: |
s3.location |
s3 bucket to store Athena table, must be set as a s3 uri for example ("s3://mybucket/data/").
By default s3.location is set s3 staging directory from |
file.type |
What file type to store the data.frame on s3. RAthena currently supports ["tsv", "csv", "parquet", "json"]. The default delimited file type is "tsv"; earlier versions of RAthena defaulted to "csv". |
compress |
Logical: whether to compress the file when uploading to s3. |
... |
Other arguments used by individual methods. |
sqlCreateTable returns the data.frame's DDL in SQL format.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used; please see `RAthena::dbConnect` documentation.
library(DBI)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# Create DDL for iris data.frame
sqlCreateTable(con, "iris", iris, s3.location = "s3://path/to/athena/table")

# Create DDL for iris data.frame with partition
sqlCreateTable(con, "iris", iris,
               partition = "timestamp",
               s3.location = "s3://path/to/athena/table")

# Create DDL for iris data.frame with partition and file.type parquet
sqlCreateTable(con, "iris", iris,
               partition = "timestamp",
               s3.location = "s3://path/to/athena/table",
               file.type = "parquet")

# Disconnect from Athena
dbDisconnect(con)

## End(Not run)
This method converts data.frame columns into the correct format so that they can be uploaded to Athena.
## S4 method for signature 'AthenaConnection'
sqlData(
  con,
  value,
  row.names = NA,
  file.type = c("tsv", "csv", "parquet", "json"),
  ...
)
con |
A database connection. |
value |
A data frame |
row.names |
Either TRUE, FALSE, NA or a string. If TRUE, always translate row names to a column called "row_names". If FALSE, never translate row names. If NA, translate row names only if they're a character vector. A string is equivalent to TRUE, but allows you to override the default name. For backward compatibility, NULL is equivalent to FALSE. |
file.type |
What file type to store the data.frame on s3. RAthena currently supports ["csv", "tsv", "parquet", "json"]. Note: this parameter is used to format any special characters that clash with the file type separator. |
... |
Other arguments used by individual methods. |
sqlData returns a data.frame formatted for Athena. It currently converts list variable types into character, split by '|', similar to how data.table writes out to files.
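A short sketch of the list-column conversion described above (assumes an AWS account; the data.frame is illustrative):

## Not run:
library(DBI)

con <- dbConnect(RAthena::athena())

# data.frame containing a list column
df <- data.frame(id = 1:2)
df$values <- list(c("a", "b"), "c")

# The list column is collapsed into a character column split by "|"
# (e.g. "a|b"), ready to be written out for upload to Athena
sqlData(con, df, file.type = "tsv")

dbDisconnect(con)

## End(Not run)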
Lower level API access, allows user to create and delete Athena Work Groups.
Creates a workgroup with the specified name (link).
The work group utilises parameters from the dbConnect object to determine the encryption and output location of the work group. The s3_staging_dir, encryption_option and kms_key parameters are taken from the dbConnect object.
Helper function to create tag options for function create_work_group()
Deletes the workgroup with the specified name (link). The primary workgroup cannot be deleted.
Lists available workgroups for the account (link).
Returns information about the workgroup with the specified name (link).
Updates the workgroup with the specified name (link).
The workgroup's name cannot be changed. The work group utilises parameters from the dbConnect object to determine the encryption and output location of the work group. The s3_staging_dir, encryption_option and kms_key parameters are taken from the dbConnect object.
create_work_group(
  conn,
  work_group = NULL,
  enforce_work_group_config = FALSE,
  publish_cloud_watch_metrics = FALSE,
  bytes_scanned_cut_off = 10000000L,
  requester_pays = FALSE,
  description = NULL,
  tags = tag_options(key = NULL, value = NULL)
)

tag_options(key = NULL, value = NULL)

delete_work_group(conn, work_group = NULL, recursive_delete_option = FALSE)

list_work_groups(conn)

get_work_group(conn, work_group = NULL)

update_work_group(
  conn,
  work_group = NULL,
  remove_output_location = FALSE,
  enforce_work_group_config = FALSE,
  publish_cloud_watch_metrics = FALSE,
  bytes_scanned_cut_off = 10000000L,
  requester_pays = FALSE,
  description = NULL,
  state = c("ENABLED", "DISABLED")
)
conn |
A dbConnect object, as returned by dbConnect(). |
work_group |
The Athena workgroup name. |
enforce_work_group_config |
If set to TRUE, the settings for the workgroup override client-side settings. |
publish_cloud_watch_metrics |
Indicates that the Amazon CloudWatch metrics are enabled for the workgroup. |
bytes_scanned_cut_off |
The upper data usage limit (cutoff) for the amount of bytes a single query in a workgroup is allowed to scan. |
requester_pays |
If set to TRUE, members assigned to the workgroup may reference Amazon S3 Requester Pays buckets in queries. |
description |
The workgroup description. |
tags |
A tag that you can add to a resource. A tag is a label that you assign to an AWS Athena resource (a workgroup).
Each tag consists of a key and an optional value, both of which you define. Tags enable you to categorize workgroups in Athena, for example,
by purpose, owner, or environment. Use a consistent set of tag keys to make it easier to search and filter workgroups in your account.
The maximum tag key length is 128 Unicode characters in UTF-8. The maximum tag value length is 256 Unicode characters in UTF-8.
You can use letters and numbers representable in UTF-8, and the following characters: + - = . _ : / @. |
key |
A tag key. The tag key length is from 1 to 128 Unicode characters in UTF-8. You can use letters and numbers representable in UTF-8, and the following characters: + - = . _ : / @. |
value |
A tag value. The tag value length is from 0 to 256 Unicode characters in UTF-8. You can use letters and numbers representable in UTF-8, and the following characters: + - = . _ : / @. |
recursive_delete_option |
The option to delete the workgroup and its contents even if the workgroup contains any named queries |
remove_output_location |
If set to TRUE, the previously specified query results output location (a client-side setting) for the workgroup is ignored and removed from the workgroup configuration. |
state |
The workgroup state that will be updated for the given workgroup. |
create_work_group: returns NULL, invisibly.

tag_options: returns a list, invisibly.

delete_work_group: returns NULL, invisibly.

list_work_groups: returns a list of available work groups.

get_work_group: returns a list of work group metadata.

update_work_group: returns NULL, invisibly.
## Not run:
# Note:
# - Requires an AWS account to run the below example.
# - Different connection methods can be used; please see `RAthena::dbConnect` documentation.
library(RAthena)

# Demo connection to Athena using profile name
con <- dbConnect(RAthena::athena())

# List current work groups available
list_work_groups(con)

# Create a new work group
wg <- create_work_group(con, "demo_work_group",
                        description = "This is a demo work group",
                        tags = tag_options(key = "demo_work_group", value = "demo_01"))

# List work groups to see new work group
list_work_groups(con)

# Get metadata from work group
wg <- get_work_group(con, "demo_work_group")

# Update work group
wg <- update_work_group(con, "demo_work_group",
                        description = "This is a demo work group update")

# Get updated metadata from work group
wg <- get_work_group(con, "demo_work_group")

# Delete work group
delete_work_group(con, "demo_work_group")

# Disconnect from Athena
dbDisconnect(con)

## End(Not run)