Matches: 6

AI-Ollama-Client ( C/CO/CORION/AI-Ollama-Client-0.05.tar.gz, CORION, 2025; MetaCPAN )
AI-Ollama-Client/scripts/code-completion.pl ( view source; MetaCPAN )
package main;
use 5.020;
use Mojo::JSON 'decode_json';
use experimental 'signatures';
use AI::Ollama::Client;
use Future::Utils 'repeat';

#use Getopt::Long;
#GetOptions(
#    'prefix|p=s' => \my $pre
AI-Ollama-Client ( C/CO/CORION/AI-Ollama-Client-0.05.tar.gz, CORION, 2025; MetaCPAN )
AI-Ollama-Client/scripts/music-genre-json.pl ( view source; MetaCPAN )
package main;
use 5.020;
use Mojo::JSON 'decode_json';
use experimental 'signatures';
use AI::Ollama::Client;
use Future::Utils 'repeat';

my $ol = AI::Ollama::Client->new(
    server => 'http://192.1
 stream => JSON::PP::false(),
)->get;
warn "Pulled '$model'";
my @prompts = @ARGV ? @ARGV : (
    qq!Please tell me three musical genres of the song "Go West" by "The Pet Shop Boys" as JSON like ```[{
mer.',
                       'Only list the musical genres.',
                       #'Answer in JSON only with an array containing objects { "genre": "the genre", "sub-genre": "the sub genre" }.',
 
AI-Ollama-Client ( C/CO/CORION/AI-Ollama-Client-0.05.tar.gz, CORION, 2025; MetaCPAN )
AI-Ollama-Client/lib/AI/Ollama/GenerateChatCompletionRequest.pm ( view source; MetaCPAN )
Currently the only accepted value is json.

Enable JSON mode by setting the format parameter to json. This will structure the response as valid JSON.

Note: it's important to instruct the model to use JSON in the prompt. Otherwise, the model may generate large amounts of whitespace.

=cut

has 'format' => (
    is       => 'ro',
    isa      => Enum[
        "json",
    ],
);
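
Putting the excerpt above to use: a minimal sketch of JSON mode on the chat endpoint. The server URL and model name are assumptions for illustration, and the response handling assumes a single non-streamed reply; the exact shape of the returned future may differ in your setup.

```perl
#!perl
use 5.020;
use AI::Ollama::Client;

# Assumed local Ollama endpoint and model; adjust for your installation.
my $ol = AI::Ollama::Client->new(
    server => 'http://127.0.0.1:11434/api',
);

my $res = $ol->generateChatCompletion(
    model    => 'llama2',
    format   => 'json',    # the only accepted Enum value, per the POD above
    messages => [
        # The prompt itself must also ask for JSON (see the note above)
        { role    => 'user',
          content => 'List three primary colors as a JSON array of strings.' },
    ],
)->get;

say $res->message->content;
```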

=head2 C<< keep_alive >>

How long (in minutes) to keep the model loaded in memory.
AI-Ollama-Client ( C/CO/CORION/AI-Ollama-Client-0.05.tar.gz, CORION, 2025; MetaCPAN )
AI-Ollama-Client/lib/AI/Ollama/GenerateCompletionRequest.pm ( view source; MetaCPAN )
Currently the only accepted value is json.

Enable JSON mode by setting the format parameter to json. This will structure the response as valid JSON.

Note: it's important to instruct the model to use JSON in the prompt. Otherwise, the model may generate large amounts of whitespace.

=cut

has 'format' => (
    is       => 'ro',
    isa      => Enum[
        "json",
    ],
);
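
The plain-completion request takes the same `format` value; again a sketch with a hypothetical endpoint and model, assuming a non-streamed response object whose text lives in its `response` field.

```perl
#!perl
use 5.020;
use AI::Ollama::Client;

my $ol = AI::Ollama::Client->new(
    server => 'http://127.0.0.1:11434/api',  # assumed local endpoint
);

my $res = $ol->generateCompletion(
    model  => 'llama2',                      # hypothetical model name
    format => 'json',
    prompt => 'Name three musical genres as a JSON array of strings.',
)->get;

say $res->response;
```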

=head2 C<< images >>

(optional) a list of Base64-encoded images to include in the m
AI-Ollama-Client ( C/CO/CORION/AI-Ollama-Client-0.05.tar.gz, CORION, 2025; MetaCPAN )
AI-Ollama-Client/lib/AI/Ollama/Client/Impl.pm ( view source; MetaCPAN )
Role
use YAML::PP;
use Mojo::UserAgent;
use Mojo::URL;
use URI::Template;
use Mojo::JSON 'encode_json', 'decode_json';
use OpenAPI::Modern;

use File::ShareDir 'module_file';

use Future::Mojo;
use Fu
fault => sub {
        if( my $fn = $_[0]->schema_file ) {
            YAML::PP->new( boolean => 'JSON::PP' )->load_file($fn);
        }
    },
);

has 'validate_requests' => (
    is => 'rw',
    def
Currently the only accepted value is json.

Enable JSON mode by setting the format parameter to json. This will structure the response as valid JSON.

Note: it's important to instruct the model to use JSON in the prompt. Otherwise, the model may generate large amounts of whitespace.
AI-Ollama-Client ( C/CO/CORION/AI-Ollama-Client-0.05.tar.gz, CORION, 2025; MetaCPAN )
AI-Ollama-Client/scripts/describe-image.pl ( view source; MetaCPAN )
package main;
use 5.020;
use Mojo::JSON 'decode_json';
use experimental 'signatures';
use AI::Ollama::Client;
use Future::Utils 'repeat';

my $ol = AI::Ollama::Client->new(
    server => 'http://192.1

Powered by Groonga
Maintained by Kenichi Ishigaki <ishigaki@cpan.org>. If you find anything, submit it on GitHub.