Various ways to deploy uWSGI


As a backend behind a web server, speaking the uwsgi protocol

<uwsgi id = "uwsgibk">
    <stats>127.0.0.1:9090</stats>
    <socket>127.0.0.1:3030</socket>
    <file>./server.py</file>
    <enable-threads/>
    <post-buffering/>
    <memory-report/>
</uwsgi>
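None of the XML snippets in this post show the server.py they load, so here is a minimal sketch of what such a WSGI file might contain (the greeting text is my own assumption; uWSGI's <file> option looks for a callable named application in the loaded file):

# server.py -- minimal WSGI application loaded via uWSGI's <file> option (illustrative sketch).
def application(environ, start_response):
    body = b"Hello from uWSGI"   # assumed response body, not from the original post
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]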

As a backend behind a web server, speaking the http protocol

The http and http-socket options are entirely different beasts. The first one spawns an additional process forwarding requests to a series of workers (think about it as a form of shield, at the same level as apache or nginx), while the second one sets workers to natively speak the http protocol. TL;DR: if you plan to expose uWSGI directly to the public, use --http; if you want to proxy it behind a webserver speaking http with its backends, use --http-socket. (See the uWSGI docs on native HTTP support.)

<uwsgi id = "httpbk">
    <stats>127.0.0.1:9090</stats>
    <http-socket>127.0.0.1:3030</http-socket>
    <file>./server.py</file>
    <enable-threads/>
    <post-buffering/>
    <memory-report/>
</uwsgi>
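A quick way to convince yourself that the workers behind <http-socket> really speak plain HTTP is to request the port directly; a small check like the following (the address comes from the config above, the script itself is my own sketch):

# check_http_socket.py -- hit the http-socket backend directly (hypothetical helper).
from urllib.request import urlopen

# 127.0.0.1:3030 is the <http-socket> address from the config above.
with urlopen("http://127.0.0.1:3030/") as resp:
    print(resp.status, resp.read(200))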

Exposed directly to the public

Here the http option is the right choice: as explained above, it spawns an extra shield process in front of the workers, which is exactly what you want when exposing uWSGI directly to the public on port 80.

<uwsgi id = "http">
    <stats>127.0.0.1:9090</stats>
    <http>:80</http>
    <file>./server.py</file>
    <enable-threads/>
    <post-buffering/>
    <memory-report/>
</uwsgi>

Nginx with the uWSGI FastRouter

http://uwsgi-docs.readthedocs.org/en/latest/Nginx.html
http://uwsgi-docs.readthedocs.org/en/latest/Fastrouter.html
http://stackoverflow.com/questions/21518533/putting-a-uwsgi-fast-router-in-front-of-uwsgi-servers-running-in-docker-containe
http://stackoverflow.com/questions/26499644/how-to-use-the-uwsgi-fastrouter-whith-nginx

Configurations of Nginx

location /test {
    include    uwsgi_params;
    uwsgi_pass 127.0.0.1:3030;
}

Configurations of FastRouter

<uwsgi id = "fastrouter">
    <fastrouter>127.0.0.1:3030</fastrouter>
    <fastrouter-subscription-server>127.0.0.1:3131</fastrouter-subscription-server>
    <enable-threads/>
    <master/>
    <fastrouter-stats>127.0.0.1:9595</fastrouter-stats>
</uwsgi>


Configurations of instance

<uwsgi id = "subserver1">
    <stats>127.0.0.1:9393</stats>
    <processes>4</processes>
    <enable-threads/>
    <memory-report/>
    <subscribe-to>127.0.0.1:3131:[server_ip or domain]</subscribe-to>
    <socket>127.0.0.1:3232</socket>
    <file>./server.py</file>
    <master/>
    <weight>8</weight>
</uwsgi>
<uwsgi id = "subserver2">
    <stats>127.0.0.1:9494</stats>
    <processes>4</processes>
    <enable-threads/>
    <memory-report/>
    <subscribe-to>127.0.0.1:3131:[server_ip or domain]</subscribe-to>
    <socket>127.0.0.1:3333</socket>
    <file>./server.py</file>
    <master/>
    <weight>2</weight>
</uwsgi>

If we issue an HTTP GET for [server_ip]/test (or [domain]/test), the request travels as follows:
Nginx >> FastRouter (port 3030) >> subserver1 (port 3232) or subserver2 (port 3333)
The FastRouter learns where the instances live from the subscription server on port 3131, and their <weight> values (8 vs. 2) bias how often each one is chosen. Every hop speaks the uwsgi protocol.

The FastRouter also exposes its own stats (port 9595 above).
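The stats socket is plain TCP: connect to it and uWSGI writes back a single JSON document, which for the FastRouter includes the currently subscribed nodes. Below is a minimal reader sketch in Python (the helper name, and the assumption that everything arrives in one connection-scoped blob, are mine, not from the original post):

# read_stats.py -- dump a uWSGI stats socket as JSON (hypothetical helper).
import json
import socket

def read_stats(host="127.0.0.1", port=9595):
    # The uWSGI stats server writes one JSON document and then closes the connection.
    with socket.create_connection((host, port)) as conn:
        chunks = []
        while True:
            data = conn.recv(4096)
            if not data:
                break
            chunks.append(data)
    return json.loads(b"".join(chunks))

if __name__ == "__main__":
    stats = read_stats()
    # For the FastRouter, a "subscriptions" section should list the nodes
    # registered via <subscribe-to>, together with their weights.
    print(json.dumps(stats, indent=2))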

The uWSGI HTTP Router

http://uwsgi-docs.readthedocs.org/en/latest/HTTP.html
Router

<uwsgi id = "httprouter">
    <enable-threads/>
    <master/>
    <http>:8080</http>
    <http-stats>127.0.0.1:9090</http-stats>
    <http-to>127.0.0.1:8181</http-to>
    <http-to>127.0.0.1:8282</http-to>
</uwsgi>

sub-server1

<uwsgi id = "httpserver1">
    <stats>127.0.0.1:9191</stats>
    <socket>127.0.0.1:8181</socket>
    <memory-report/>
    <file>./server.py</file>
    <enable-threads/>
    <post-buffering/>
</uwsgi>

sub-server2

<uwsgi id = "httpserver2">
    <stats>127.0.0.1:9292</stats>
    <memory-report/>
    <file>./server.py</file>
    <socket>127.0.0.1:8282</socket>
    <enable-threads/>
    <post-buffering/>
</uwsgi>

Other router modes

Just like the FastRouter and the HTTP router, each of uWSGI's other routers can either forward (via http-to style options) or route requests to backend sub-servers; the only real difference between them is the protocol they speak. The protocols are:
Fast (uwsgi), HTTP(S), raw, and SSL.
Naturally, all of these routers expose their own stats in the same style.
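The stats reader sketched in the FastRouter section above works unchanged for any of these routers: just point it at the matching *-stats address, for example 127.0.0.1:9090 for <http-stats> above, or 127.0.0.1:9191 / 127.0.0.1:9292 for the sub-servers' own <stats> sockets.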

The uWSGI Emperor

http://uwsgi-docs.readthedocs.org/en/latest/Emperor.html
Emperor

<uwsgi id = "emperor">
    <emperor>./vassals</emperor>
    <emperor-stats-server>127.0.0.1:9090</emperor-stats-server>
</uwsgi>

Vassal1

<vassal1>
    <uwsgi id = "vassal1">
        <http>:8080</http>
        <stats>127.0.0.1:9191</stats>
        <memory-report/>
        <enable-threads/>
        <post-buffering/>
        <file>./server.py</file>
        <chdir>..</chdir>
    </uwsgi>
</vassal1>

Vassal2

<vassal2>
    <uwsgi id = "vassal2">
        <http>:8181</http>
        <stats>127.0.0.1:9292</stats>
        <memory-report/>
        <enable-threads/>
        <post-buffering/>
        <file>./server.py</file>
        <chdir>..</chdir>
    </uwsgi>
</vassal2>

Emperor also has specific stats.

multi-mountpoint

<uwsgi id = "vassal1">
    <socket>127.0.0.1:3030</socket>
    <http>:8080</http>
    <stats>127.0.0.1:9191</stats>
    <memory-report/>
    <enable-threads/>
    <post-buffering/>
    <manage-script-name/>
    <chdir>..</chdir>
    <mount>/pic=server.py</mount>
    <mount>/test=fuck.py</mount>
    <workers>2</workers>
</uwsgi>

Note:

http://stackoverflow.com/questions/19475651/how-to-mount-django-app-with-uwsgi

The <manage-script-name/> option is the key here, yet the official help does not mention it at all.

With this configuration, uWSGI dispatches each request to a different app according to the request path.
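To see what <manage-script-name/> actually buys you, it helps to look at the WSGI environ: with the mounts above, a request for /pic/foo should reach the app mounted at /pic with SCRIPT_NAME set to /pic and PATH_INFO set to /foo. A tiny echo app like the following (my own sketch, not one of the files referenced above) makes that visible:

# echo_app.py -- hypothetical WSGI app for inspecting how uWSGI splits the path per mountpoint.
def application(environ, start_response):
    # With <manage-script-name/>, uWSGI moves the mountpoint into SCRIPT_NAME
    # and leaves the remainder in PATH_INFO, which is what WSGI frameworks expect.
    body = "SCRIPT_NAME={!r} PATH_INFO={!r}".format(
        environ.get("SCRIPT_NAME"), environ.get("PATH_INFO")).encode()
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]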

The End

As a backend, uWSGI works better with Nginx than with Apache, so Nginx is the recommended front end.

If you deploy Nginx+uWSGI or Apache+uWSGI over an INET socket on CentOS and clients hit 50x Bad Gateway errors, it is worth trying the same configuration on Ubuntu: an identical setup that fails on CentOS may work fine there.

Update 2014-11-03: the root cause of this CentOS problem is that Red Hat based distributions enable SELinux by default, and some of its policies break this setup (whether that counts as a bug is debatable). Disabling SELinux fixes it; see http://www.cnblogs.com/lightnear/archive/2012/10/06/2713090.html for how to turn it off. Debian based distributions do not enable SELinux by default, so they are not affected. I only stumbled on SELinux last weekend while reading an article comparing the performance of Linux distributions, tried disabling it today, and it was indeed the culprit. If you run into other similarly odd problems, disabling SELinux is worth a try.

If, after deploying with Apache+uWSGI, the client receives a file download instead of a page, it is probably the result of Apache compressing the response; turning off Apache's GZIP module fixes it. Nginx does not have this problem, which is another point in its favor.

A final gripe

uWSGI's documentation is, frankly, a mess. Knowledge about it among Chinese-speaking users is quite scarce, so I am posting these notes here for their convenience.

For niche tools like this there is very little domestic material compared with what is available abroad; there is still a gap to close, so keep at it, folks.

