| author | mo khan <mo@mokhan.ca> | 2025-10-08 10:42:46 -0600 |
|---|---|---|
| committer | mo khan <mo@mokhan.ca> | 2025-10-08 10:42:46 -0600 |
| commit | 4072dedb8f17040f3dbe1ec8400e2eb87b044c2d (patch) | |
| tree | cd7f30fc71793a914e6d9c4d3d3878b51200d064 /README.md | |
| parent | 58d4e9be69cf5fbc951bc63858ac899dded47bee (diff) | |
feat: add support for streaming responses
Diffstat (limited to 'README.md')
| -rw-r--r-- | README.md | 34 |
1 file changed, 34 insertions, 0 deletions
@@ -90,6 +90,40 @@ headers = { 'Authorization' => Net::Hippie.bearer_auth('token') }
 Net::Hippie.get('https://www.example.org', headers: headers)
 ```
 
+### Streaming Responses
+
+Net::Hippie supports streaming responses by passing a block that accepts the response object:
+
+```ruby
+Net::Hippie.post('https://api.example.com/stream', body: { prompt: 'Hello' }) do |response|
+  response.read_body do |chunk|
+    print chunk
+  end
+end
+```
+
+This is useful for Server-Sent Events (SSE) or other streaming APIs:
+
+```ruby
+client = Net::Hippie::Client.new
+client.post('https://api.openai.com/v1/chat/completions',
+  headers: {
+    'Authorization' => Net::Hippie.bearer_auth(ENV['OPENAI_API_KEY']),
+    'Content-Type' => 'application/json'
+  },
+  body: { model: 'gpt-4', messages: [{ role: 'user', content: 'Hi' }], stream: true }
+) do |response|
+  buffer = ""
+  response.read_body do |chunk|
+    buffer += chunk
+    while (line = buffer.slice!(/.*\n/))
+      next if line.strip.empty?
+      puts line if line.start_with?('data: ')
+    end
+  end
+end
+```
+
 ## Development
 
 After checking out the repo, run `bin/setup` to install dependencies. Then, run `bin/test` to run the tests.
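For context on how the block form added in this commit streams data, the sketch below shows the plain Ruby standard-library mechanism that a wrapper like Net::Hippie would typically delegate to: `Net::HTTP#request` yields the response before the body is consumed, and `Net::HTTPResponse#read_body` with a block yields body fragments as they arrive. This is an illustrative assumption, not code from the commit, and the endpoint and payload are placeholders.

```ruby
require 'net/http'
require 'uri'
require 'json'

# Hypothetical endpoint used purely for illustration.
uri = URI('https://api.example.com/stream')

request = Net::HTTP::Post.new(uri)
request['Content-Type'] = 'application/json'
request.body = JSON.generate(prompt: 'Hello')

Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
  # Passing a block to #request yields the response before its body is read.
  http.request(request) do |response|
    # read_body with a block streams the body in chunks instead of
    # buffering the entire response in memory.
    response.read_body do |chunk|
      print chunk
    end
  end
end
```

Because the body is consumed chunk by chunk, a long-lived or very large response never has to be held in memory all at once, which is what makes the SSE-style usage in the README example practical.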
