On Linux, you generally connect to Vertica with the vsql client to run commands. Below is an overview of the most commonly used vsql commands.
Exporting data:
/opt/vertica/bin/vsql -U $usr -p 5433 -h 172.1.1.1 -w $pwd -At -o /home/qincf/20160809.dat -c "select * from tb_test;"
By default fields are separated by '|'. To specify a different delimiter:
/opt/vertica/bin/vsql -U $usr -p 5433 -h 172.1.1.1 -w $pwd -F $'\t' -At -o /home/qincf/20160809.dat -c "select * from tb_test;"
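The `$'\t'` passed to `-F` is bash ANSI-C quoting: the shell expands it to a single literal tab byte before vsql ever sees it. A quick check in any bash shell:

```shell
# $'\t' is expanded by bash into one literal tab character
delim=$'\t'
# od -c renders the byte as \t, confirming what vsql receives via -F
printf '%s' "$delim" | od -An -c
```

Without the `$'...'` form, `-F '\t'` would pass the two characters backslash and `t` rather than a tab.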
Change the working directory: \cd
dbadmin=> \!pwd
/home/qincf
dbadmin=> \cd /tmp
dbadmin=> \!pwd
/tmp
List all tables: \d
List all functions: \df
List all projections: \dj
List all schemas: \dn
List all sequences: \ds
List all system tables: \dS
List all supported data types: \dT
List all views: \dv
Edit SQL: \e
This opens an editor; type the SQL you want to run, then save and exit to execute it (multiple SQL statements can be run at once).
Execute the SQL in the query buffer: \g
Output results in HTML format: \H
dbadmin=> \H
Output format is html.
dbadmin=> select * from nodes limit 1;
<table border="1">
<tr>
<th align="center">node_name</th>
<th align="center">node_id</th>
<th align="center">node_state</th>
<th align="center">node_address</th>
<th align="center">node_address_family</th>
<th align="center">export_address</th>
<th align="center">export_address_family</th>
<th align="center">catalog_path</th>
<th align="center">node_type</th>
<th align="center">is_ephemeral</th>
<th align="center">standing_in_for</th>
<th align="center">node_down_since</th>
</tr>
<tr valign="top">
<td align="left">v_qcf_node0001</td>
<td align="right">45035996273704980</td>
<td align="left">UP</td>
<td align="left">172.1.1.1</td>
<td align="left">ipv4</td>
<td align="left">172.1.1.1</td>
<td align="left">ipv4</td>
<td align="left">/data/qincf/v_qincf_node0001_catalog/Catalog</td>
<td align="left">PERMANENT</td>
<td align="left">f</td>
<td align="left"> </td>
<td align="left"> </td>
</tr>
</table>
<p>(1 row)<br />
</p>
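Because each row comes out as a plain `<tr>`/`<td>` block, the HTML can be post-processed with ordinary text tools. A rough sketch using a fragment of the session above (a real pipeline should use a proper HTML parser):

```shell
# save a fragment of the \H output (taken from the session above) to a file
cat > /tmp/nodes_row.html <<'EOF'
<tr valign="top">
<td align="left">v_qcf_node0001</td>
<td align="left">UP</td>
</tr>
EOF

# strip the tags, leaving one cell value per line
sed -n 's/.*<td[^>]*>\(.*\)<\/td>.*/\1/p' /tmp/nodes_row.html
```

This prints `v_qcf_node0001` and `UP`, one per line.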
View the contents of the current query buffer: \p
dbadmin=> \p
select * from nodes limit 1;
Change a user's password: \password [ USER ]
dbadmin=> \password test
Changing password for "test"
New password:
Clear the current query buffer: \r
dbadmin=> \r
Query buffer reset (cleared).
dbadmin=> \p
Query buffer is empty.
Save the command history to a file: \s [ FILE ]
\s history.log
List privileges on all tables: \dp or \z
COPY from standard input:
vsql -U username -w passwd -d vmart -c "COPY store.store_sales_fact FROM STDIN DELIMITER '|';"
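COPY ... FROM STDIN is convenient in pipelines, since any command's output can be streamed straight into a table. A minimal sketch (the table, database, and credentials are the hypothetical ones from the command above; the load step is skipped when vsql is not installed):

```shell
# build a small '|'-delimited data file to stream in (hypothetical sample rows)
printf '1001|store_a|25\n1002|store_b|40\n' > /tmp/sales.dat

# feed the file to COPY via standard input; skip gracefully without vsql
if command -v vsql >/dev/null 2>&1; then
  vsql -U username -w passwd -d vmart \
       -c "COPY store.store_sales_fact FROM STDIN DELIMITER '|';" < /tmp/sales.dat
else
  echo "vsql not found; skipping load"
fi
```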
COPY a file directly from HDFS into Vertica:
COPY testTable SOURCE Hdfs(url='http://hadoop:50070/webhdfs/v1/tmp/test.txt',
username='hadoopUser');